
Showing posts with label Technology. Show all posts

Saturday, November 3, 2012

Evolution of the Keyboard


When Bill Buxton worked at Xerox's Palo Alto Research Center in the early 1990s, he examined the classic children's homemade telephones: two cups connected by a taut string. He wondered why that same concept couldn't improve computer keyboards.
Think about it. The cup is both a microphone and a speaker. It uses the same "hardware" for input and output of sound. Why, Buxton asked, couldn't the same principle apply to text on computers—using a single device for both input and output of text rather than using input from a keyboard to produce output on a screen?
Buxton wasn't alone in recognizing an eventual fusion of the two. Fast-forward a couple decades—and add myriad researchers and huge corporate R&D budgets—and we have touch-screen keyboards on tablets and smartphones. Inputs and outputs share the same surface. The keyboard has fused with the screen, at least for some computing tasks.
But as anyone who's typed on a virtual keyboard—or yelled at a voice-control app like Siri—can attest, no current text input holds a candle to a traditional computer keyboard when it comes to comfort, speed and accuracy. Maybe eventually we'll connect computers to our neurons, but in the meantime, the simple yet highly functional electromechanical keyboard will be around -- and keep improving -- for some time.
Decades after its introduction in the mid-'80s,
IBM's classic Model M remains a favorite
for keyboard purists.
Buxton, now a design guru at Microsoft Research, still closely examines old keyboards for forgotten tricks and technologies that could spawn new ways of thinking about how we enter information into a computer.
"Many of the great discoveries are right under our noses," he says when discussing the future of the keyboard. "A lot of the stuff that's emerging as new is rooted in things that have happened in the past -- and in some cases the really distant past."
Before we look at where computer keyboards might go in the future, then, let's look at where they've been.

Keying up the past

The evolution of the keyboard is not a clean timeline. Contributions to its look, feel and underpinning technologies sometimes draw from preceding models and other times from a far corner of the inventor's universe.
The first devices we'd recognize as related to modern keyboards date from the 19th century. In 1852 John Jones patented a "mechanical typographer," and 15 years later Christopher Sholes received a patent for a "type-writing machine" -- what is usually considered the original typewriter. Even these very early keyboards inform much of today's keyboard design.
"The typewriter [keyboard] had all sorts of functions. The shift key was really big because you needed a big surface area to push down and raise the carriage up," says David Hill, vice president of design and user experience at computer manufacturer Lenovo. "There was a mechanical advantage required."
Early computer keyboards mimicked
the feel of IBM's classic Selectric typewriter.
As far as direct influences on the modern computer keyboard, IBM's Selectric typewriter was one of the biggest. IBM released the first model of its iconic electromechanical typewriter in 1961, a time when being able to type fast and accurately was a highly sought-after skill.
Dag Spicer, senior curator at the Computer History Museum, notes that as the Selectric models rose to prominence, admins grew to love the feel of the keyboard because of IBM's dogged focus on making the ergonomics comfortable. "IBM's probably done more than anyone to find [keyboard] ergonomics that work for everyone," Spicer says. So when the PC hit the scene a decade or two later, the Selectric was largely viewed as the baseline to design keyboards for those newfangled computers you could put in your office or home.
In the late 1970s, companies like Cherry, Key Tronic and the Micro Switch division of Honeywell took off with their own approaches to mimicking the mechanical feel of a typewriter with the circuitry of a computer keyboard. "It was a real big deal back then," says Craig Gates, CEO of KeyTronicEMS, as the company is now called. "How the [keyboard] felt, how reliable it was, what speed could be achieved with a certain design of the switch."

Early switch designs

One of the first computer keyboard designs from the early '70s incorporated reed switches, which work with a magnet and two metal filaments. When the magnetic field gets close enough, it pulls the two filaments together and thus completes a circuit -- or, in the case of a key, a keystroke.
This Key Tronic keyboard used reed switches to
record keystrokes.
These keyboards housed circuit boards with 100 to 120 reed switches, each covered by a key. Underneath each key top was a tiny magnet. When someone depressed the key, the magnet made the filaments touch, thus generating an electrical signal for the desired character to type.
But filaments are fragile. (If you've dealt with busted holiday lights, you know this.) So these reed switch keyboards weren't reliable, Gates explains. If one broke or got out of alignment, or if dust obstructed the contact points, the key wouldn't work anymore—and, unlike holiday lights, individual keys weren't easy to replace.
In addition, they were subject to microvibrations that opened and closed the switch a few times in a single keystroke, thus tricking the computer into thinking the letter had been pressed several times successively. (Microvibrations are still an issue in some keyboards, but microprocessors filter them out.)
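That filtering is commonly called debouncing. The sketch below (Python, purely illustrative; real keyboards do this in microcontroller firmware, and the sample values and `stable_count` threshold are assumptions) shows the idea: a change in the switch's state only counts once it has held steady for several consecutive samples, so brief microvibration glitches are ignored.

```python
def debounce(samples, stable_count=3):
    """Collapse noisy switch samples into clean key events.

    A state change is only reported after the new level has been seen
    `stable_count` times in a row, so microvibration chatter is filtered out.
    """
    events = []
    current = 0      # debounced state: 0 = key up, 1 = key down
    candidate = 0    # level we are currently watching
    run = 0          # consecutive samples at the candidate level
    for s in samples:
        if s == candidate:
            run += 1
        else:
            candidate = s
            run = 1
        if run >= stable_count and candidate != current:
            current = candidate
            events.append("press" if current else "release")
    return events

# A noisy press: the contact chatters (1, 0, 1) before settling closed.
noisy = [0, 0, 1, 0, 1, 1, 1, 1, 0, 0, 0]
print(debounce(noisy))  # ['press', 'release'] -- one keystroke, not three
```

Without the stability check, the chatter in the middle of that sample stream would read as several rapid keystrokes.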
KeyTronic shows off the layers of its capacitive keyboards
from the late '80s and early '90s.
So in the late '70s, Gates says, reed switches began to give way to keys that relied on a magnetic principle called the Hall Effect. These keyboards, made by Micro Switch and others, didn't use physical contact points to complete a keystroke -- instead they used magnetism, which can be more precise (and thus less liable to error) and doesn't require as many moving parts.
Meanwhile, Key Tronic, keen to get away from reed switches, developed the capacitive switch, which worked by putting a little bit of aluminum foil under the key top. When the key was depressed, the foil changed the capacitance of the circuit board underneath and a microprocessor registered a keystroke. This idea was soon improved upon with membrane keyboards, which simplified the capacitance mechanisms under the key and brought down production costs.
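A hypothetical sketch of how such a controller might decide a capacitive key is down (Python; the capacitance values and threshold are invented for illustration, not Key Tronic's actual figures):

```python
# Assumed values: each key pad reads a low capacitance when the key is up
# and a higher one when the foil under the key top is pressed close to it.
BASELINE_PF = 5.0     # pad capacitance with the key up (assumption)
PRESSED_PF = 20.0     # pad capacitance with the key down (assumption)
THRESHOLD_PF = (BASELINE_PF + PRESSED_PF) / 2  # midpoint decision threshold

def scan(readings):
    """Return the indices of keys whose capacitance exceeds the threshold."""
    return {i for i, pf in enumerate(readings) if pf > THRESHOLD_PF}

# Keys 1 and 3 are held down in this simulated scan of five keys.
print(scan([5.1, 19.8, 4.9, 18.5, 5.3]))  # {1, 3}
```

A real controller would combine this threshold test with the debouncing described earlier, but the decision itself is just a comparison per key.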

Trimming hardware, cutting costs

Though the materials sound cheap, keyboards were expensive in the early '80s. The typical keyboards Key Tronic and Micro Switch sold to computer makers ran about $100, as opposed to three or four bucks for the typical OEM keyboard today. To cut costs in a fiercely competitive market, keyboard manufacturers began to look for ways to cut hardware from the key while ensuring that the key tops, key weights, balance, foundation and "distance to travel"—the space it takes to register a keystroke—were familiar to users' fingers.
This required evaluating the hardware that makes the key move up and down. The "snap point" is one of the most important concepts that govern a keystroke, according to Aaron Stewart, a Lenovo senior design engineer reportedly nicknamed "Mr. Keyboard." This is the point where the key pops, your brain registers you've typed a letter and you pull back your finger. Think back to the first time you typed on a touch screen—remember the shock of not having the snap point?
Patented in 1978, the buckling spring key
mechanism drove IBM's popular PC keyboards.
Additionally, keyboard makers have to consider the "break force" of the key, which has to provide enough resistance to allow your fingers to rest on the key top without inadvertently depressing it, but also needs to be weak enough to let you type without feeling like you're punching through a membrane with each keystroke.
In 1978, IBM received a patent for a "buckling spring" key mechanism that mimicked the feel of the old Selectrics. The mechanism worked with a small spring attached to non-parallel surfaces under the keycap.
The spring coiled normally when depressed but "buckled" to the side at the snap point due to the non-parallel surfaces of attachment -- and created the familiar click-clack sound of IBM's popular Model M keyboard and other old keyboards. The buckled portion of the spring activated the circuit, which generated the keystroke.

In a rubber-dome keyboard (shown upside down), the key caps
push down on the domes, which collapse, closing circuits
and recording keystrokes, and then snap back.
But cost cutting gave way to newer ways of suspending the key by IBM and other manufacturers. Rubber domes, which work with the same snapping principle as a toilet plunger, and scissor switches, which also have a rubber dome but use a scissoring mechanism attached to the key top to push down the dome, came to prominence in the late '80s and early '90s.
Part of the goal of the new designs was to reduce the distance of travel. Comfort and speed when typing depend on the distance of travel for the key on each stroke. Shaving off precious fractions of millimeters improved the typing experience for many users.
"Compared to historical examples, today's desktops and notebooks have roughly 40 percent less [distance to travel]," Lenovo's Hill points out. Typing on rubber-dome and scissor-switch keyboards is usually quieter as well.
The scissoring mechanism used in scissor-switch keyboards
shortens the distance a key must travel to record a stroke.
These designs were also cheaper to produce, pushing keyboards to commodity status, according to Gates, and these two types of springs still underpin most of the computer keyboards on the market. Today the low-profile scissor-switch keys are typically found in notebooks and thin keyboards, including the chiclet-style keyboards on Apple's laptops. The taller rubber-dome keys are typically found in standard desktop keyboards and use an interlocking "chimney" structure in place of a scissors to stabilize the key travel.
As with any bygone technology, though, there are still enthusiasts who swear by the old IBM buckling springs. Indeed, keyboards with mechanical switches have undergone something of a renaissance in recent years as users pine for their crisp tactile feedback.

Thinner, lighter...one-handed?

Today, making a thin laptop with a great keyboard is no easy task. Designers run through a startling amount of math and habit analysis to arrive at very precise distances and positions between keys, which need to be exactly where our brain expects them or we'll type more slowly and make more errors. Meanwhile, ergonomic factors must be weighed against dimensions, weight and other practical design considerations, explains Lenovo's Stewart.
Ultrabooks like the Vizio Thin + Light strive for a
 top-grade keyboard in an ultraslim profile
Dish-shaped key tops guide the finger to the center of the key, but the concave shape makes it trickier to keep a laptop thin. A keyboard requires a solid foundation, but the additional material for a good base can add weight. The other side of the coin is that reducing the amount of materials in the keyboard frees space for microprocessors and a bigger battery. Stewart calls the sum of all of these design factors a moving target.
Manufacturers are constantly trying to cut costs and make the keyboard smaller—yet people want a consistency from their keyboards. It's the foundation of their interface with the computer. A company can tweak all the mechanisms or circuitry under the keycap, but if it makes for a poor typing experience, people won't buy the product. Keyboard manufacturers have to weigh the value of innovation against the ergonomic impact.
For now, Stewart believes that range of innovation extends only to the space under the keycap. "With the technology we have today, we think there is a finite limit of being able to create [thin, high-quality keyboards]," he says.
Synaptics says its new ThinTouch key technology
will mean stunningly thin keyboards.
Tactus' technology uses microfluidics to provide tactile
buttons that rise up from a touch screen's surface.
An outfit called Tactus is taking a different approach with microfluidics "buttons" -- essentially small pouches on the surface of the screen that fill with liquid, appearing only when you need to type. When they're not in use, they deflate, leaving a flat surface. Tactus CEO Craig Ciesla is hopeful that his company, like Synaptics, will have something ready for the market by the middle of next year.
Yet even as it looks toward the future, one of Tactus' core technologies is rooted in the past. Ciesla points out that microfluidics has been around for a couple decades in industries like biotech and computer printers. "We're just redeploying it in a novel and unique way," he notes.
Will wraparound keyboards like this Twitch concept
design provide a comfortable way to
type on mobile devices?
Moving in a whole different direction, a company called Twitch Technologies is developing add-on products such as a pair of one-handed keyboards that wrap around the left and right edges of tablets. Your fingers type on the back of the device and your thumbs on the front, and you use finger combinations rather than one key per letter to type -- for example, pressing your left pinky and right thumb might produce an A. (Don't hunt for Twitch's keyboards in stores yet; they're still in the concept stage.)
Reinventing the layout of the keyboard is hard for us to imagine, but even one-handed keyboards with no letters on them have roots in the past. When inventor Doug Engelbart gave "The Mother of All Demos" -- introducing myriad computing technologies we still use today, like the mouse and videoconferencing -- he demoed a five-finger chorded keyboard that produced letters with different finger combinations. That was in 1968.
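To make the chording idea concrete, here is a toy encoding (Python). The finger-to-letter assignments are hypothetical, not Engelbart's actual keyset layout; the point is that five fingers give 2^5 - 1 = 31 non-empty combinations, comfortably enough for the alphabet.

```python
def chord_to_letter(fingers):
    """Map a set of finger indices (0-4) to a letter.

    Each finger is one bit of a 5-bit chord value. This hypothetical
    assignment maps chord value 1 -> 'a', 2 -> 'b', ... 26 -> 'z';
    the remaining values could carry punctuation or shifts.
    """
    code = sum(1 << f for f in fingers)
    if not 1 <= code <= 26:
        return None  # empty chord, or one of the 5 spare combinations
    return chr(ord('a') + code - 1)

print(chord_to_letter({0}))      # 'a' -- one finger alone
print(chord_to_letter({1}))      # 'b'
print(chord_to_letter({0, 1}))   # 'c' -- two fingers together
print(chord_to_letter({0, 4}))   # 'q' -- chord value 1 + 16 = 17
```

As with any chorded scheme, the trade-off is learnability: every letter must be memorized as a finger pattern rather than read off a keycap.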
The catch, as Microsoft's Buxton points out, is that when you implement a new keyboard, everyone has to learn to type again. But it may just be worth it.
Caleb Garling is a staff writer for the San Francisco Chronicle covering technology and business. He used to be on the staff of Wired, covering enterprise technology and culture. He has caught a trout barehanded only twice in his life.

Source: pcworld.com

Friday, November 2, 2012

Cell-site outages fall to 19 percent in area hit by Sandy, FCC says


About 19 percent of cell sites in the area hardest hit by Hurricane Sandy were still out of service on Thursday as recovery was slowed by other network failures and power shortages, according to the U.S. Federal Communications Commission.
By 10 a.m. EDT Thursday, the outages had declined from about 22 percent of all cell sites in the region a day earlier, the FCC said in a statement on Thursday afternoon. That was an average across the area most affected by the storm, stretching across parts of 10 states. In addition, cable TV and cable Internet outages had been reduced to about 12 percent to 14 percent, the agency said.
"Overall, we're seeing both continued improvement in communications networks and also that much work remains to be done to restore service fully," FCC Chairman Julius Genachowski said in the statement. As a key part of the recovery effort, the agency is working with federal, state and local authorities to help get fuel to generators, he said.

Service gaps in hard-hit areas

There was steady improvement in the wired and wireless communications networks across the storm area, but restoration of service in the areas hardest hit, such as New York and New Jersey, has been more difficult, said David Turetsky, head of the FCC's Public Safety & Homeland Security Bureau. Some sites that could otherwise have come back online were held back by failures elsewhere in the communications infrastructure, he said.
The FCC said its Disaster Information Reporting System (DIRS) remained active, and the agency was still collecting data from carriers about the effects of the storm.
Emergency 911 calls are being received throughout the storm-affected area, though in some cases they are being re-routed to other 911 centers or don't contain location information, the FCC said.
On Thursday, the major wired and wireless carriers continued to bring facilities back up and deployed portable cell sites, some of which offered free device charging for people who had lost power.
T-Mobile USA reported that its network had been 85 percent restored in New York City and 80 percent restored on Staten Island. Verizon Communications said it had restored backup power to four critical facilities in lower Manhattan and one on Long Island that had suffered from flooding on Monday night. Those included the company's Manhattan headquarters.
Though Sandy had been downgraded from hurricane status before it reached land on Monday, it devastated a wide swath of the East Coast from North Carolina to Canada, stretching west to Michigan. The worst damage was in New York City and northern New Jersey.
At a press conference on Thursday, New York City Mayor Michael Bloomberg said AT&T was deploying portable cell sites near emergency assistance centers that have been set up around the city to help residents and distribute food. The trucks use satellite to connect to the rest of AT&T’s network and the Internet. Visitors can charge their mobile devices at the portable cells.
Verizon Wireless has also deployed cell sites on wheels where needed throughout the Northeast. The company has set up Wireless Emergency Communications Centers (WECCs) at Monmouth University in New Jersey and at two sites in Toms River, New Jersey, plus “stores on wheels” in Sea Girt and Howell, New Jersey. It’s offering device charging and free domestic calls to local residents at the WECCs and all its open store locations.
Meanwhile, Verizon Enterprise Solutions said cloud and data centers operated by Verizon and Verizon Terremark have remained operational and their services were unaffected.
Late on Thursday, Sprint Nextel said its network had been fully restored in Washington, D.C., Maryland, Virginia, Delaware, Maine, Vermont, Ohio and Kentucky. After “significant progress,” the network was more than 90 percent operational in Massachusetts, New Hampshire, Pennsylvania and Rhode Island.
But service was harder to come by in New York, New Jersey and Connecticut, where the network was only 80 percent restored—only 75 percent in New York City.

Source: techhive.com

Monday, October 29, 2012

Can Skype really take the place of a face-to-face meeting?

It's the ultimate business hack: Instead of traveling to meet with a client, a design team, or anyone else you need to see face-to-face, you stay put and set up a video call instead. The technology is there—Skype, WebEx, etc.—and it can save you considerable time and money.
Indeed, think of what's involved in the typical business trip. Airfare. Hotel. Rental cars and/or taxis. Lunches. Dinners. And at least a day of your time, if not the better part of a week.
A videoconference, on the other hand, requires only a few minutes of setup, some equipment you may already have, and the time it takes for the actual meeting. Like I said: the ultimate business hack.
But how does this fly in the real world? Can Skype and similar services really take the place of a face-to-face sitdown at a conference table or business dinner?
I think it depends in part on the business. Cambridge, Mass.-based Plan B Salon, for example, offers 15-minute video consultations, giving customers a chance to learn about their options and ask questions before actually traveling to the salon.
In Lafayette, Ind., therapist Buck Black offers counseling sessions via Skype, thus allowing him to have a customer base that spans the country instead of just the city.
Now, those are fairly specialized businesses. Would that kind of videoconferencing work for your enterprise? I'd like to hear your thoughts.
In the meantime, I think one reason videoconferencing hasn't really caught on in boardrooms is hardware limitations: You can't comfortably gather a group of people around a laptop screen.
Enter TelyHD Business Edition, a dedicated Skype Webcam that plugs into an HDTV instead of a computer. That not only gives you a much larger screen for your meetings, but also affords much better video and audio than you get from the typical laptop Webcam.
The TelyHD works with your existing Skype account and supports multi-party calling (up to six locations simultaneously). There's also a companion Windows app that allows for screen and document sharing. Price tag: $499. Steep for a Webcam, yes, but less than I spent on airfare alone for my last business trip. Food for thought.
I've tested the consumer version of the TelyHD, and it works quite well. My immediate thought: This belongs in boardrooms! It'll be interesting to see if it catches on there, as it's definitely a more affordable solution than most business videoconferencing systems.
Source: pcworld.com


IBM's next-gen chips may swap silicon for carbon nanotubes


IBM has hit a milestone in its quest to come up with a successor to silicon computer chips.
The company said Sunday its research into semiconductors based on carbon nanotubes, or CNTs, has yielded a new method to accurately place them on wafers in large numbers. The technology is viewed as one way to keep shrinking chip sizes once current silicon-based technology hits its limit.
IBM said it has developed a way to place over 10,000 transistors made from CNTs on a single chip, two orders of magnitude more than previously possible. While still far below the density of commercial silicon-based chips—current models in desktop computers can have over a billion transistors—the company hailed it as a breakthrough on the path to using the technology in real-world computing.
The company made the announcement to mark the publication of an article detailing the research in the journal Nature Nanotechnology.
Intel's latest processors are built using silicon transistors with 22-nanometer technology, and simpler NAND flash storage chips have been demonstrated using "1X" technology somewhere below that, but modern manufacturing is nearing its physical limits. Intel has predicted it will produce chips using sizes in the single digits within the next decade.

Guided by Moore's Law

An IBM scientist shows different solutions
containing carbon nanotubes.
The march toward ever-smaller transistors has produced chips that use less power, run faster and can be made at lower cost, as more can be crammed onto a single wafer. The steady doubling of the number of transistors on a given amount of silicon was famously predicted by Gordon Moore, co-founder of Intel, who observed that the count doubles roughly every two years.
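As a rough illustration of that doubling (assuming the commonly cited two-year doubling period; the starting figures are approximate):

```python
def transistors(start_count, start_year, year, doubling_years=2):
    """Project a transistor count under a simple doubling model."""
    return start_count * 2 ** ((year - start_year) / doubling_years)

# Intel's first microprocessor, the 4004 (1971), had about 2,300
# transistors. Doubling every two years for four decades predicts a
# count in the billions -- roughly where 2012-era desktop chips landed.
print(f"{transistors(2300, 1971, 2011):,.0f}")  # 2,411,724,800
```

The projection is only a heuristic, but it shows why even a two-orders-of-magnitude jump in CNT density still leaves a long road to parity with silicon.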
Carbon nanotubes, tube-shaped carbon molecules, can also be used as transistors in circuits, and at dimensions of less than 10 nanometers. They are smaller and can potentially carry higher currents than silicon, but are difficult to manipulate at large densities.
Unlike traditional chips, in which silicon transistors are etched into circuit patterns, making chips using CNTs involves placing them onto a wafer with high accuracy. Semiconducting CNTs also come mixed with metallic CNTs that can produce faulty circuits, and must be separated before they are used.
IBM said its latest method solves both issues. The company's researchers mix CNTs into a liquid solution that is then used to soak specially prepared substrates with chemical "trenches" to which the CNTs bond in the correct alignment needed for electrical circuits. The method also eliminates the unwanted metallic CNTs.
The company said the breakthrough will not yet lead to commercial nano-transistors, but is an important step along the way.
Before CNTs can challenge silicon, however, they must also pass an often-overlooked part of Moore's law: affordability. Moore framed his prediction in terms of "complexity for minimum component costs," or what consumers are likely to see in the market.
Source: pcworld.com

Online tools to use to track Hurricane Sandy's power


Hurricane Sandy is proving to be a storm not to be trifled with. If you need to stay abreast of the latest developments, there are plenty of useful resources you can find online.
Whether you're within Sandy's wide path or miles away, here are some tips to use if you can get online, and sometimes even if you can't, to stay informed.

Live audio and video

An image from The Weather Channel's live
broadcast Monday morning.
If you have satellite radio hardware, Sirius XM has replaced its preview channels with live audio of The Weather Channel's coverage of Sandy.
You can find the coverage on channel 1 on XM radios and channel 184 on Sirius. Because it airs on the preview channels, any XM or Sirius radio can tune in, even without an active subscription.
The Weather Channel is streaming live video of its on-air broadcast for the duration of the storm. The video stream is compatible with mobile devices including iOS and Android. This could be useful for folks who have already lost power, but still get connectivity bars on their phones.
Various local television outlets are simulcasting their coverage online: Philadelphia’s ABC affiliate WPVI offers live video, and NBC affiliate WCAU is doing the same. In New York, expected to be one of the hardest hit areas by Sandy’s storm surge, ABC affiliate WABC also is broadcasting live online.

Storm danger

The flood of coverage reflects the power of Hurricane Sandy, the so-called megastorm that is more than 900 miles wide. Sandy's path will take it across some of the most populous areas of the country and put more than 50 million people at risk.
New York City Mayor Michael Bloomberg urges residents to evacuate low-lying areas of the five boroughs, the Weather Channel reports the hurricane was still intensifying as of early Monday morning, and the storm is even impacting the Great Lakes to the west including the Chicago area.
Tech companies including Facebook and Google postponed product events scheduled for New York City. AT&T, Sprint, T-Mobile, and Verizon all say they have emergency measures in place to keep their networks running and restore service as soon as possible during any outages.

On the Web

Anyone looking for raw hurricane data can turn to the National Hurricane Center for information. Links to current information on Sandy appear right on the front page, including current intensity, forecast tracks, and radar and satellite data. The Weather Underground features some key storm data as well.
Google's Crisis Map displaying Hurricane Sandy
data on Monday morning
Beyond its live online broadcast, The Weather Channel has loads of coverage worth reading, including a Top 5 things you need to know post and a breakdown of expected impacts on major cities on the East Coast.
The site's front page also includes current Hurricane Sandy data such as the storm's category rating, location, direction, wind speeds, and pressure.
The Wall Street Journal and The New York Times dropped their pay walls for the duration of Hurricane Sandy so you can view their coverage. Other New York newspaper sites worth checking out online include the New York Post and the New York Daily News.
Google activated its crisis map Web application for Sandy, which overlays forecast tracks and radar data on information such as weather observations and the locations of storm shelters. If Google's Web app includes too much data to take in all at once, you can turn different layers of data on and off on the right-hand side of the page.
If you want to really appreciate Sandy's mammoth size, head over to NASA's Earth Observatory page. Scroll down and you'll find links to about a dozen high-resolution images of Sandy; more hurricane images may be added over the coming days.

Apps

If you want an Android or iOS smartphone or tablet app for dedicated Sandy information, there are numerous options, but your best bets are probably apps from the Weather Channel (Android, iOS), the Weather Underground (Android, iOS), FEMA, the Federal Emergency Management Agency (Android, iOS), and Hurricane by the American Red Cross (Android, iOS).
The Red Cross
hurricane app
Don't forget that many smartphones and MP3 players come equipped with an FM radio receiver, which could come in handy for getting vital reports, especially immediately after the storm passes. If you haven't done so yet, test your device's FM radio app to see if it's working properly.
Most FM radio apps require headphones to act as an antenna, so make sure you keep a pair in your pocket after you've tested that the app works.

Social media proves its worth

The Weather Channel has a dedicated hurricane Twitter feed, @twc_hurricane, that includes news and tweets from users in the storm's path, such as Instagram shots from Atlantic City, New Jersey. The Reuters news wire, @reuters, has a lot of Sandy-themed information, and The Weather Channel's Stephanie Abrams, @StephanieAbrams, publishes key storm data.
The Federal Emergency Management
Agency (FEMA) on Twitter
For tips and shelter information, you should also bookmark accounts such as @fema from the Federal Emergency Management Agency and the American Red Cross, @RedCross. New Yorkers should also keep tabs on @311NYC, the city's Twitter equivalent of the 311 government services phone number.
Finally, if you want raw citizen journalism, check out the #sandy hashtag, as well as local hashtags like #nyc, #philly, or the hashtag for any other city in the storm's path.

Facebook, Google+, and Instagram

Facebook's Global Disaster Relief page is a good resource for Hurricane Sandy information, as is the Government on Facebook page. Google+ users can bookmark the Hurricane Sandy 2012 stream (warning, this stream is moving fast) for tips and information being posted from across the search giant's social network.
On Instagram's mobile apps, you can also use the #sandy hashtag to find photos of the storm. You cannot search Instagram directly from the web, but services such as Statigram allow you to search through the thousands of pictures that have been posted to the photo-sharing service.
Have suggestions or favorite apps and sites to keep up to date on Sandy? Please share them with everyone in the comments.

Source: techhive.com




Friday, October 19, 2012

A123 Introduces New Battery Technology Amid Recent Financial Troubles


You have to hand it to A123 Systems. After experiencing an embarrassing and costly manufacturing snafu this past March that required a recall costing the company US $55 million, then having to report a first-quarter loss of $125 million in May, one might have expected the company to retrench in order to sort out how its revenues last quarter had decreased by 40 percent from the same quarter in 2011. But A123 didn't do that. Instead, the company announced an update to its Nanophosphate® lithium iron phosphate battery technology.
This would be an understandable approach to managing dwindling fortunes, especially if the company had suddenly devised a Li-ion battery that could compete head to head with internal combustion engines. As U.S. Energy Secretary Steven Chu once declared, "a rechargeable battery that can last for 5000 deep discharges, [and offer] 6 or 7 times as much storage capacity (3.6 MJ/kg = 1,000 Wh) at [one-third of today's costs] will be competitive with internal combustion engines (400-500 mile range)." A press release proclaiming that would certainly change the conversation and forever alter the company's market fortunes.
However, we didn’t get that. Instead, we got A123's same battery technology, but updated so that it can operate at extreme temperatures. "We believe Nanophosphate EXT is a game-changing breakthrough that overcomes one of the key limitations of lead acid, standard lithium-ion, and other advanced batteries. By delivering high power, energy, and cycle life capabilities over a wider temperature range, we believe Nanophosphate EXT can reduce or even eliminate the need for costly thermal management systems, which we expect will dramatically enhance the business case for deploying A123's lithium ion battery solutions for a significant number of applications," said David Vieau, the company's CEO, in a press release.
Will removing or just reducing the need for cooling systems be a game changer for both the transportation and telecommunications markets? Both applications will certainly benefit, but I imagine it will make more of a difference in the telecommunications market, where it will be used to power cell tower sites built off-grid or in regions with unstable power. After all, it’s hard to see how this brings Li-ion battery technology any closer to propelling a car for 800 kilometers on a single charge while lowering the price of the battery system by a factor of three. I am not sure this changes the conversation, never mind the game.

Thursday, October 18, 2012

How it works: The technology of touch screens

From single-touch to multitouch and why all displays are not equal.

Here's a question: What is a technology that you can't see, but is essential to smartphones, tablets and other mobile devices -- and is estimated to generate $16 billion in revenues this year (according to DisplaySearch)? The answer is multitouch touch screens -- which have sparked the explosive growth of the mobile device market.

It was not so long ago that we would tap away on a PalmPilot with a tiny stylus, or exercise our thumbs on a BlackBerry micro-keyboard. Then, in January 2007, along came the Apple iPhone, and everything changed. Suddenly, people were swiping their fingers across screens, pinching images and performing other maneuvers that had not previously been part of the smartphone interface.

Now we not only take touch input for granted, we expect to be able to use multitouch (using more than one finger on the screen at a time) and gestures as well. What made this touch screen revolution possible, and where is it likely to take us?

Many paths to touch

To begin with, not all touch is created equal. There are many different touch technologies available to design engineers.

According to touch industry expert Geoff Walker of Walker Mobile, there are 18 distinctly different touch technologies available. Some rely on visible or infrared light, some use sound waves, and some use force sensors. They all have individual combinations of advantages and disadvantages, including size, accuracy, reliability, durability, number of touches sensed and -- of course -- cost.

As it turns out, two of these technologies dominate the market for transparent touch technology applied to display screens in mobile devices. And the two approaches have very distinct differences. One requires moving parts, while the other is solid state. One relies on electrical resistance to sense touches, while the other relies on electrical capacitance. One is analog and the other is digital. (Analog approaches measure a change in the value of a signal, such as the voltage, while digital technologies rely on the binary choice between the presence and absence of a signal.) Their respective advantages and disadvantages present clearly different experiences to end users.

Resistive touch

The traditional touch screen technology is analog resistive. Electrical resistance refers to how easily electricity can pass through a material. These panels work by detecting how much the resistance to current changes when a point is touched.



This process is accomplished by having two separate layers. Typically, the bottom layer is made of glass and the top layer is a plastic film. When you push down on the film, it makes contact with the glass and completes a circuit.

The glass and plastic film are each covered with a grid of electrical conductors. These can be fine metal wires, but more often they are made of a thin film of transparent conductive material; in most cases, this material is indium tin oxide (ITO). The conductors on the two layers run at right angles to each other: the parallel lines on the glass sheet run in one direction, and those on the plastic film run perpendicular to them.

When you press down on the touch screen, contact is made between the grid on the glass and the grid on the film. The voltage of the circuit is measured, and the X and Y coordinates of the touch position are calculated based on the amount of resistance at the point of contact.

This analog voltage is processed by an analog-to-digital converter (ADC) to create a digital signal that the device's controller can use as an input signal from the user.
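The last step -- turning the digitized voltage into a screen position -- can be sketched in a few lines of code. This is a hypothetical illustration, not any real controller's firmware: the function name, the 12-bit ADC resolution, and the linear scaling are all assumptions made for the example.

```python
# Hypothetical sketch of how a resistive touch controller might map raw
# ADC readings to pixel coordinates. A voltage gradient is applied across
# one layer while the other layer is sampled; the sampled voltage is
# proportional to where along the gradient the two layers touch. The
# layers' roles are then swapped to measure the other axis.

ADC_MAX = 4095  # full-scale reading of an assumed 12-bit ADC

def adc_to_coordinates(raw_x, raw_y, screen_width, screen_height):
    """Linearly scale two raw ADC readings to pixel coordinates."""
    x = round(raw_x / ADC_MAX * (screen_width - 1))
    y = round(raw_y / ADC_MAX * (screen_height - 1))
    return x, y

# A reading of 0 maps to one edge of the screen, a full-scale reading
# to the opposite edge, and mid-scale readings to roughly the center.
print(adc_to_coordinates(2048, 2048, 320, 480))
```

Real controllers add calibration and noise filtering on top of this scaling, since the film layer's resistance varies from panel to panel, but the basic voltage-to-coordinate conversion works along these lines.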