Ask DT: Our answers to this week’s reader questions

Welcome to the inaugural Ask DT post, where we’ll be answering reader questions you’ve sent in. This week we’ve got a rather technical question about chargers, some ways to placate your Timeline troubles, and the age-old developer query about whether to go Android or iOS. Check out our answers, and be sure to keep sending in your questions to askdt@digitaltrends.com. 

The need for a voltage converter when abroad

Q: I recently moved from the U.S. to Europe, and I’ve been using a converter/adapter (one that changes the voltage) to power my laptop. I plug the U.S. power cable into the converter, and then plug that, with the adapter prongs for Europe, into the wall outlet. Am I OK continuing to do this, or is the risk of a power surge significant? Thank you!  — Sent by “K”

Nick Mokey: Good news: Chances are, you don’t need the voltage converter you’re using at all. Almost all laptop power supplies will run quite happily on the 120-volt grid in the States or the 240-volt grid in Europe. The only adapter you need is one that converts the flat U.S. prongs to the rounded European ones — like the one at right.

Why? The laptop’s power supply — that’s the annoying brick in the cord — is already doing the task of stepping wall voltage down to what the computer needs to run, so your other voltage converter is redundant. You’ll actually save electricity (and have one less hot plastic brick at your feet) if you ditch it for a simple plug adapter.

Personally, I’ve never seen a supply that won’t work on both continents, but to make absolutely sure before you send your laptop up in flames, look at the fine print on the back. If you see something like “100-240V 50-60Hz” listed under “input,” you’re good. Just get ready to squint. It’s the highlighted text in the photo below:

As for power surges, buy a standard surge protector at a shop in Germany and use your adapter in one of the outlets it offers. Just keep in mind that the cheapest models only offer basic protection, so if peace of mind is important, spring for something with a warranty.

Timeline troubles

Q: Help! I have managed to remove Timeline from Google Chrome and Mozilla Firefox, however I still have Timeline on my mobile app! Is there any way to remove this? There are way too many glitches, it keeps refreshing, and it just sucks altogether! Please, any suggestions? Hard to believe, with so many folks hating the new Timeline, that Facebook wouldn’t allow you to have a choice. Any ideas would be great! I’d love to get it off my mobile app! — Sent by “B”

Molly McHugh: Unfortunately, if you’re using the official Facebook app, you’re out of luck. When you’re accessing Facebook via desktop browser, there are lots of toolbar extensions that can get rid of Timeline for you, but no such hack exists for the mobile experience.

However, you do have another option in the form of alternative Facebook apps. None of these will look exactly like the old Facebook did, but if you’re seriously bothered by Timeline, they might offer some relief.

For Android, Friendcaster is a good option. It has a familiar UI but gets rid of that Timeline bar that’s irking you. Seesmic is another anti-Facebook Facebook mobile tool, although its color scheme and formatting might be too much of a contrast for most. Seesmic also has an iPhone app that’s more of a social networking hub than a plain old Facebook alternative, but you might find it’s a better fit.

If you’re an iPhone user, check out Ace for Facebook. Other options include Ultimate for Facebook (a little emoticon-heavy for my tastes) as well as Facely HD.

Or you could join Path.

Developer dilemma: iPhone or Android? 

Q: To make this short, I am an aspiring mobile developer and I have this weird dilemma. For the past few weeks I have read dozens of articles complaining about Android’s fragmentation and how that is discouraging developers about its future. On the other hand, I also read that Android has more freedom than iOS and already has a strong hold on a large chunk of the market. I hope in the future to develop for more than just Android or iOS, but for now I’d like some guidance on which operating system to try first. Any helpful opinions? Maybe somebody has experience developing for Android and iOS. Thanks. — Sent by “F”

Jeffrey Van Camp: Thanks for the question. I am not an active developer myself, but I do follow this stuff closely. It all depends on what kind of app you’re trying to make and what your goals are for it. If you want to write an app once and get it onto a lot of phones without much hassle, I’d lean toward the iPhone (30 percent or so market share). Apple is sometimes difficult to deal with. I hope you’re not trying to write an app that looks like iBooks, for example. But the App Store is where a lot of apps get their start. With iPhone, you basically only have to write for one phone (and a few older models of that device). If you want to charge money for an app, the iPhone has a user base that is far more willing to spend a few bucks on an app than Android’s.

On the other hand, if you simply want to reach the most users possible, Android might be your best bet (50 percent or so market share). If your app takes advantage of device-specific features, be prepared for a bit of a nightmare, because every Android device is different, with different quirks, screen sizes, processors, and features. For example, if you are making a camera app, you have to account for the fact that some Android phones have physical shutter buttons while others don’t. Android users also don’t like to pay for things, but if your app is ad-supported, you could still see big uptake.
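
For a flavor of what that fragmentation means in practice, here’s a minimal, hypothetical Android sketch (the CameraActivity class and the takePicture() helper are made up for illustration) of the kind of runtime checks a camera app ends up doing: ask PackageManager whether optional hardware like a flash is present, and handle the physical shutter key only on devices that actually send it.

```java
import android.app.Activity;
import android.content.pm.PackageManager;
import android.os.Bundle;
import android.view.KeyEvent;

// Hypothetical activity showing runtime checks for optional Android hardware.
public class CameraActivity extends Activity {

    private boolean hasFlash;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        // Not every Android phone ships with a flash; query before offering
        // a flash toggle in the UI.
        PackageManager pm = getPackageManager();
        hasFlash = pm.hasSystemFeature(PackageManager.FEATURE_CAMERA_FLASH);
    }

    @Override
    public boolean onKeyDown(int keyCode, KeyEvent event) {
        // Some handsets have a physical shutter key; many never send this
        // event, so the on-screen shutter button stays the primary control.
        if (keyCode == KeyEvent.KEYCODE_CAMERA) {
            takePicture();
            return true;
        }
        return super.onKeyDown(keyCode, event);
    }

    private void takePicture() {
        // Capture logic would go here (hypothetical helper, not a platform API).
    }
}
```

The equivalent iPhone app rarely needs branches like these, which is the trade-off described above: one phone to target versus a much bigger but much more varied audience.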

Apple also requires you to use its tools and follow its style guides, while Android development is a bit more lax. Finally, the iPhone marketplace is full of very good apps. With Android, there are just as many apps, if not more, but the ratio of good to crap leans toward the crappy. So if you have a great UI, you could stand out better there.

Honestly, in time, you will want to be on both. If Windows Phone takes off and BlackBerry survives through 2013, you’ll probably want to consider those as well. I hope that helps!
