Hacked Kinect taught to work as multitouch interface

11 11 2010

We gotta say, the last time we were this excited about hardware hacking For The Greater Good was when people started using the Wiimote for all sorts of awesome projects. Kinect is naturally a lot more complicated, but there’s also a lot of potential here, and we can’t wait to see what people come up with. Florian Echtler took that open source driver and hooked the Kinect into his own multitouch UI “TISCH” software library (which actually supports the Wiimote as an input already, funny enough). The result is a bit of MS Surface-style multitouch picture shuffling and zooming, but it uses full body tracking instead of touchscreen input, of course. The self-effacing Florian had this to say in the video description: “I thought I’d get the mandatory picture-browsing stuff done so it’s out of the way and everybody can focus on more interesting things.” You’re still a hero in our book, man. Always a hero.
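
For the curious: the video doesn't spell out how TISCH ingests the depth data, but the basic depth-camera-to-touch-points trick is simple enough to sketch. Below is a minimal, hypothetical Python take using the libfreenect wrapper plus NumPy and SciPy; the threshold and blob-size values are made up, and any real setup would need its own calibration.

```python
# A rough sketch (not TISCH's actual pipeline) of turning Kinect depth
# frames into multitouch-style contacts: threshold the depth image near
# a virtual "touch plane," then treat each remaining blob as a finger.
import freenect                  # libfreenect Python wrapper
import numpy as np
from scipy import ndimage

# Raw Kinect depth comes back in 11-bit sensor units, not millimeters;
# these thresholds are hypothetical and need per-setup calibration.
PLANE = 750                      # depth value of the virtual surface
BAND = 30                        # how far in front still counts as a touch

def touch_points():
    depth, _ = freenect.sync_get_depth()         # (480, 640) depth array
    mask = (depth > PLANE - BAND) & (depth < PLANE)
    labels, n = ndimage.label(mask)              # connected-component blobs
    points = []
    for i in range(1, n + 1):
        ys, xs = np.nonzero(labels == i)
        if xs.size > 50:                         # skip speckle noise
            points.append((xs.mean(), ys.mean()))
    return points                                # blob centroids in pixels

while True:
    print(touch_points())        # feed these into whatever UI layer you like
```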

Feeling left out on all these Kinect shenanigans because you’re rocking a Mac? Well, libfreenect has now been ported to OS X by Theo Watson (who sounds unenthused about his accomplishment in the video embedded after the break). Also: once you’re done admiring your IR-rendered visage on your shiny Apple-built hardware, scrounge yourself up a working Linux box. All the cool people are doing it.





Google’s driverless car drives interest in driverless cars

18 10 2010

Self-driving cars are hardly new. We’ve seen dozens of autonomous vehicles over the years, many of which have seen advances driven (so to speak) by various DARPA challenges. But now that Google’s involved — whoa! — the mainstream media is suddenly whipped into a frenzy of hyperbolic proclamations about the future. Still, it is fascinating stuff to watch. So click on through if you like having your tech salad tossed with a side of smarmy TV-news voiceover. Trust us, it’s delicious.

http://abcnews.go.com/GMA/video/test-driving-google-car-11857670?&clipId=11857670&cid=embedded

Source: ABC News





HP’s Photosmart eStation Android tablet hands-on

7 10 2010

So here it is, after months of details coming to light an inkdrop at a time: the HP eStation all-in-one printing solution. But we’re only gonna dwell on half of that: the 7-inch tablet skinned by Yahoo and powered by Android 2.1. As we expected, though, the Google experience is decidedly less than what you’re accustomed to: search is Yahoo only, and our attempt to find an alternate method was met with a barebones settings menu (search via the browser page still works). Additionally, there is no access to Android Market, relegating your customization instead to HP’s print-heavy app store — sorry, no games, as that’s not what the company wants to focus on here, according to the rep. That also means no native Gmail, much to our dismay. What Yahoo has provided is a suite of apps and widgets that actually work well in their simplicity, from weather to stocks and search.

We were reminded at numerous points that this is a prototype build, and for good reason — the responsiveness was noticeably sluggish, especially in the browser. That said, the Nook store and e-reading app were as fluid as you’d ever need. WiFi is equipped on both the tablet and the printer for cloud-based connectivity on the go. Battery life is measured at four to six hours, and Android 2.2 is expected by the holidays, still sans Market, but beyond Flash (and at this point we question its performance on this hardware), there’s probably not a lot of value-add in the update. Expect this AIO to be shipping in the next few weeks.





MIT Medical Lab Mirror tells your pulse with a webcam

7 10 2010

Mirror mirror on the wall, who has the highest arterial palpation of them all? If you went to MIT you might be able to answer that question thanks to the work of grad student Ming-Zher Poh, who has found a way to tell your pulse with just a simple webcam and some software. By looking at minute changes in the brightness of the face, the system can find the beating of your heart even at low resolution, with results comparable to those of a traditional FDA-approved pulse monitor. Right now the mirror above is just a proof of concept, but the idea is that the hospital beds or surgery rooms of tomorrow might be able to monitor a patient’s pulse without requiring any wires or physical contact, encouraging news for anyone who has ever tried to sleep whilst wearing a heart monitor.
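
Poh's published method pairs face tracking with independent component analysis, but the core signal-processing idea fits in a few lines. Here's a minimal sketch, assuming you've already extracted a per-frame brightness series from the face region; the function name and the synthetic test are ours, not MIT's.

```python
# A bare-bones sketch of the underlying idea (Poh's actual method adds
# face tracking and independent component analysis): average the green
# channel over the face each frame, then find the dominant frequency in
# the range of plausible heart rates.
import numpy as np

def estimate_bpm(brightness, fps):
    """brightness: 1-D array of per-frame mean green values over the face."""
    signal = brightness - brightness.mean()        # strip the DC offset
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fps)
    band = (freqs >= 40 / 60.0) & (freqs <= 240 / 60.0)   # 40-240 bpm
    return 60.0 * freqs[band][np.argmax(spectrum[band])]

# Synthetic check: a 72 bpm (1.2 Hz) pulse buried in noise, 30 s at 30 fps
t = np.arange(0, 30, 1.0 / 30)
fake = 0.05 * np.sin(2 * np.pi * 1.2 * t) + np.random.randn(t.size) * 0.02
print(estimate_bpm(fake, fps=30))                  # prints ~72
```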





Berkeley Bionics reveals eLEGS exoskeleton, aims to help paraplegics walk in 2011

7 10 2010

Wondering where you’ve heard of Berkeley Bionics before? These are the same whiz-kids who produced the HULC exoskeleton in mid-2008, and now they’re back with a far more ambitious effort. Announced just moments ago in San Francisco, the eLEGS exoskeleton is a bionic device engineered to help paraplegics stand up and walk on their own. It’s hailed as a “wearable, artificially intelligent, bionic device,” and it’s expected to help out within the hospital, at home and elsewhere in this wild, wild place we call Earth. Initially, the device will be offered to rehabilitation centers for use under medical supervision, and can be adjusted to fit most people between 5’2″ and 6’4″ (and weighing 220 pounds or less) in a matter of minutes. We’re told that the device provides “unprecedented knee flexion,” and it’s also fairly quiet in operation; under ideal circumstances, speeds of up to 2MPH can be attained, and it employs a gesture-based human-machine interface that relies on legions of sensors to determine a user’s intentions and act accordingly. Clinical trials are about to begin, and there’s a limited release planned for the second half of 2011. We’re still waiting to hear back on a price, so keep it locked for more as we get it live from the event.

Update: We just got to see the eLEGS walk across stage, and you’ll find a gallery full of close-up pics immediately below. We also spoke to Berkeley Bionics CEO Eythor Bender, who detailed the system a bit more — it’s presently made of steel and carbon fiber with lithium-ion battery packs, weighs 45 pounds, and has enough juice to run for six hours of continuous walking. While he wouldn’t give us an exact price, he said they’re shooting for $100,000, and will be “very competitive” with other devices on the market. Following clinical trials, the exoskeleton will be available to select medical centers in July or August, though Bender also said the company’s working on a streamlined commercial version for all-day use, tentatively slated for 2013.





NFL FanVision review — and behind the scenes

28 09 2010

We’ve been wanting to try out Kangaroo TV’s FanVision in-stadium video handheld ever since we first heard it was coming to 10 NFL teams (and the Michigan Wolverines) this year, and we finally got our chance last night during the Packers / Bears game here in Chicago. The system is actually super interesting, as it’s the only large-scale DVB-T operation we’ve seen in the States; FanVision sets up a private network for each team and sports event it works with. At Soldier Field, that means there are two transmitters at either end of the field for people in the stadium, and another located in the scoreboard so the devices work while people are tailgating in the parking lot. The system has about 8Mbps of bandwidth, so each of the 10 channels on the device streams at about 800Kbps, a quality level that produces some blockiness but is perfectly watchable on the FanVision handheld’s 4.3-inch QVGA screen.

What’s on the 10 FanVision channels varies depending on the team, stadium, and broadcast setup of each game; Monday Night Football games have a SkyCam channel, for instance, while games at Dolphins Stadium have a dedicated cheerleader channel. There are also three other game broadcasts and the NFL RedZone channel, so you can keep track of everything else going on in the league. The device buffers three of the camera channels locally for replays — an operator in the control room tags the beginning and end of each play and specifies which camera has the best angle on the play. It’s a slick trick that lets you dial up a replay instantly, since it’s stored right there on the device. We just wish you could store more than the last play — once the next play starts the old replay is gone forever.
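
For the curious, that tagging scheme maps naturally onto a rolling buffer. Here's a hypothetical Python sketch of how a device like this might do it; FanVision's actual firmware is obviously its own beast, and the class below is ours, not theirs.

```python
# A hypothetical sketch of the replay scheme described above: a rolling
# buffer of recent frames plus operator in/out tags. Like the real unit,
# it keeps only the most recent play; tagging a new one discards the old.
from collections import deque

class ReplayBuffer:
    def __init__(self, fps=30, seconds=60):
        self.frames = deque(maxlen=fps * seconds)  # rolling window of frames
        self.pushed = 0                            # total frames ever pushed
        self.start = None
        self.replay = None                         # the one stored replay

    def push(self, frame):                         # called once per frame
        self.frames.append(frame)
        self.pushed += 1

    def tag_play_start(self):                      # operator marks the snap
        self.start = self.pushed

    def tag_play_end(self):                        # operator marks the whistle
        oldest = self.pushed - len(self.frames)    # absolute index of frames[0]
        begin = max(self.start - oldest, 0)
        self.replay = list(self.frames)[begin:]    # overwrites the prior replay
```

Storing one play this way is why the old replay vanishes the moment the next one is tagged; keeping several would just mean a list of snapshots instead of a single slot.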

That’s not the end of the world, though — another operator spends the entire game building a continuously-looping highlight channel, so you can catch up on what’s going on at any point in the game. You can also watch the scoreboard feed, as well as the network broadcast of the game itself — and since FanVision is retransmitting the feed directly from the broadcast truck, it’s actually a few seconds ahead of watching it on TV. (For Monday Night Football FanVision was around six or seven seconds ahead of the TVs in the stadium, since ESPN delays the broadcast so they can bleep out any inadvertent profanity the mics pick up.)

Of course, you don’t go to the stadium to watch ESPN in real time, you go to watch the game — and we’re happy to report FanVision makes watching the live game much more interesting as well. For starters, you’re not stuck waiting for the scoreboard operator when it comes to replays — anyone with a FanVision last night knew right away that the Packers wasted a challenge on that backbreaking James Jones fumble. (Maybe they should give these to coaches!) You’re also not stuck waiting for stats, since both team and individual numbers are all right there — including some deeper info like offensive success running and passing to different parts of the field. (Fun fact: all the stats in the NFL are delivered by a central server called NFL GSIS, pronounced “Jesus.”) It’s also kind of cool that the device buzzes and lights up red when either team is in the red zone — you should probably have noticed that anyway, but, well, red LEDs. You know how it is.

We’ve also found that cell service in a stadium is usually atrocious, so being able to check on other scores, other games, and even load up fantasy stats on the FanVision is incredibly useful — although you can’t load up a fantasy defense, for some reason. (FanVision says they’re working on it, but there’s no timeline.) All of these features work flawlessly in practice — the handheld is smooth and responsive, and tuning between channels and calling up data is as fast as we’ve ever seen on a mobile video device. Battery life is pegged at six hours, more than enough for an NFL game; at golf events FanVision rents the units with a spare battery and people use ’em all day.

Speaking of rentals, that’s actually the biggest problem with the FanVision in the NFL — you can only buy the units for $199 right now. If you have season tickets or you’re otherwise regularly attending games it’s a no-brainer; you’ll definitely get your money’s worth out of a FanVision. If you’re only going to a game here or there, though, it’s kind of a lot of money, especially since it doesn’t do a damn thing outside of the stadium. FanVision says it’s working on rentals for the NFL, but nothing’s happening yet — right now the only way to get a FanVision at a football game is to either be lucky enough to score a free promotional unit or pony up the two Bennys.

We’ll be honest: we weren’t expecting FanVision to be nearly as good as it is — single-purpose embedded gadgets are usually a laggy mess of tortured UI design and poor performance. Not so with the FanVision, which is the second revision of a third-generation product that’s been used for the PGA and NASCAR in the past — it’s quick, it’s intuitive, and it’s actually fun to use. Once FanVision works out a rental model, these little screens will be totally ubiquitous at NFL games — who wouldn’t pay an extra $20 or $30 for this much of an improvement to the live experience? The other hurdle is getting all the teams to sign on, and that’s slow going — Jets fans will be able to use FanVision during games at the New Meadowlands, while the Giants haven’t gotten on board. But all of that’s just a matter of time — as is the Packers’ eventual murderous revenge against the Bears.





Ubuntu prototype uses face recognition to intelligently move UI elements

21 09 2010

Not that we haven’t seen mock-ups before for systems using webcams to intelligently move user interface elements, but it’s another thing entirely for a company to make a public proclamation that it’s tinkering with implementing something of the sort in a future build of its OS. Over at the Canonical design blog, one Christian Giordano has revealed that the company is in the early stages of creating new ways to interact with Ubuntu, primarily by using proximity and orientation sensors in order to have one’s PC react based on how they’re sitting, where they’re sitting, and where their eyes and head are pointed. For instance — once a user fires up a video and leans back, said video would automatically go into fullscreen mode. Similarly, if a user walked away to grab some coffee and a notification appeared, that notification would be displayed at fullscreen so that he / she could read it from far away. There’s no mention just yet of when the company plans to actually bring these ideas to end-users, but the video embedded after the break makes us long for “sooner” rather than “later.”
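
Canonical hasn't published code, but the proximity half of the trick can be approximated with garden-variety face detection: the bigger the face looks to the webcam, the closer the user is. Here's a rough OpenCV sketch along those lines; the width thresholds are pure guesswork and would need per-camera tuning.

```python
# A hedged sketch of the proximity idea (not Canonical's implementation):
# use the apparent size of the user's face as a crude distance estimate,
# then pick UI behavior accordingly.
import cv2

# Haar cascade ships with OpenCV; the path helper below is from opencv-python
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cam = cv2.VideoCapture(0)

FAR, NEAR = 90, 220          # face width in pixels; calibrate per camera

while True:
    ok, frame = cam.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    if len(faces) == 0:
        print("nobody there -> blow notifications up to fullscreen")
        continue
    width = max(w for (x, y, w, h) in faces)   # widest detected face
    if width < FAR:
        print("leaned back -> e.g. send the playing video fullscreen")
    elif width > NEAR:
        print("up close -> regular, detailed UI")
```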

Source: Canonical