If you’ve ever watched an episode of Law & Order or CSI, you have probably noticed some outlandish forensic work involving photographs being used to help the show’s team of detectives further their investigation. Sometimes it’s cleaning up pixelation, or using a minute reflection in a window to read some perp’s name tag, all of which is mostly cooked up in fantasy.
How would you like your future smartphone screen to be completely glare-free, water-repellent, and self-cleaning? Researchers at MIT published a paper describing how they selectively removed parts of the glass to create microscopic cones, which apparently give the glass the ability to resist fogging and glare. The MIT news site states the following:
“The new ‘multifunctional’ glass, based on surface nanotextures that produce an array of conical features, is self-cleaning and resists fogging and glare, the researchers say. Ultimately, they hope it can be made using an inexpensive manufacturing process that could be applied to optical devices, the screens of smartphones and televisions, solar panels, car windshields and even windows in buildings.”
The lack of glare or fog would make the glass nearly invisible. Also, water would simply bead up and roll right off, taking any dust along with it and making the glass easy to keep clean. Check out the video of water droplets rolling off the glass after the break.
MIT’s Media Lab is at it again. This time they demoed NewsFlash, which uses high-frequency red and green light to transmit data to the built-in camera on a receiving device. In the demo they used an iPad to transmit to a Samsung Epic 4G. The concept is similar to a QR code, but without the 2D square: the generated light pulses are invisible to the human eye and a little more graceful. This is only at the concept stage, so don’t expect to see it soon, but one has to wonder if NFC would be a better alternative. Hit the break for the demo video from Engadget.
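MIT hasn’t published the actual NewsFlash encoding, but the basic idea of signaling bits through rapid color changes can be sketched in a few lines. The mapping below (green frame = 1, red frame = 0, one frame per bit) is purely a hypothetical illustration, not the real protocol:

```python
# Toy sketch of the NewsFlash concept: a payload is turned into a
# sequence of red/green "frames" that a camera could sample, then
# decoded back. The real MIT encoding is not public; this mapping
# (green = 1 bit, red = 0 bit) is an assumption for illustration.

RED, GREEN = (255, 0, 0), (0, 255, 0)

def encode(data: bytes) -> list:
    """Emit one colored frame per bit, most significant bit first."""
    frames = []
    for byte in data:
        for i in range(7, -1, -1):
            frames.append(GREEN if (byte >> i) & 1 else RED)
    return frames

def decode(frames: list) -> bytes:
    """Rebuild the payload by reading the color of each frame."""
    bits = [1 if frame == GREEN else 0 for frame in frames]
    out = bytearray()
    for i in range(0, len(bits), 8):
        byte = 0
        for bit in bits[i:i + 8]:
            byte = (byte << 1) | bit
        out.append(byte)
    return bytes(out)

frames = encode(b"hi")
print(decode(frames))  # b'hi'
```

In practice the hard part is what this sketch skips: flickering fast enough that the eye averages the colors out, and synchronizing the camera’s frame rate with the transmitter.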
Google Labs was responsible for some of the apps we can’t live without on our devices: Google Search, Google Goggles and my personal favorite, App Inventor. Unfortunately the app was phased out along with Google Labs, but not before the source code was opened up for all. Of course, since App Inventor was a pivotal educational tool, the Massachusetts Institute of Technology picked it up right where Google left off.
I bring some good news today, as the beta version of the application is available to anyone with a Google account! That’s right: anyone can jump in and create their own apps in a versatile, innovative and naturally intuitive environment. Anyone who’s ever wanted to give app development a shot but didn’t quite understand the jargon involved with software development should head over to MIT’s App Inventor website (source link below) and give it a go.
Inventors and app developers are about to get an early Christmas present courtesy of MIT. The WYSIWYG app-building tool is back: MIT received a donation from none other than Google to create the MIT Center for Mobile Learning, and one of the first items on its agenda was to resurrect App Inventor, the landmark app-creation tool that was sadly shut down by Google. MIT has followed through on that resurrection and released the initial source code to the masses. In addition, they will occasionally update the source code to match what they are doing in-house.
Don’t expect too much detail about the source code for now. MIT does not have much documentation for it at this time, as they are focusing their resources on getting a large-scale public server up by April. However, there’s already an ever-growing community, which likely has information for those wanting to get up and running today.
Interested folks ready to jump in and try out the source code can register at the MIT link below. If you don’t hear an immediate response from MIT, know that it’s likely because they already have a long list of interested users and perhaps don’t want to overload the service, especially while it’s in its infancy. Is anyone else excited about the idea of users being able to create the next big app again? Sound off in the comments section and let us know what you think.
[via MIT Blog by Android And Me]
In recent news, Intel is looking to get a foot in the door of the smartphone and tablet market. It appears they want the world to know that they can compete with chips built on the architecture of ARM, the U.K. company whose designs dominate mobile. Intel upped its game last week when MIT’s Technology Review tested smartphone and tablet prototypes equipped with Intel’s latest chipset, called Medfield. The Medfield devices ran Google’s Android OS and showed some impressive results.
“We expect products based on these to be announced in the first half of 2012,” says Stephen Smith, vice president of Intel’s architecture group.
These devices are being dubbed “reference designs” and are being shipped out to persuade handset manufacturers to build devices around the new Intel hardware.
“They can use as much or as little of the reference design as they like…”
The quote above also came from Smith, who hinted that we could see some serious hardware housing the new tech at CES in January. The Medfield design is directly in line with Intel’s “Atom” series chipset and promises better power consumption, something Intel is not really noted for. What previously took several chips to accomplish will now be possible with a single chip that combines an array of functionality. “This is our first offering that’s truly a single chip,” says Smith. The all-in-one design, aka system on a chip (SoC), has become the standard for the ARM-based CPUs in today’s smartphones.
The prototype Technology Review worked with had dimensions similar to the iPhone 4S; however, it was noticeably lighter, since it uses more plastic and less glass and metal. The device was sporting Android 2.3 (Gingerbread). In addition, it was said to be “powerful and pleasing” to use, easily on par with the latest Apple device. The handset could play Blu-ray-quality video, stream to TV sets and browse the web with smooth, fast scrolling. All in all, we’re sort of looking forward to seeing what Intel can come up with. On to CES!
Last week we heard the news that Google’s Android App Inventor was to be shut down. Speculation is that this is due in part to newly appointed CEO Larry Page’s desire to focus the company’s efforts. Of note, App Inventor isn’t the only product affected: the entirety of Google Labs is also “being phased out”. This is a significant loss for Android, as the Labs were directly responsible for mobile products we love like Google Goggles, Gesture Search, and Sky Map. What innovative new products might we now miss out on?

Luckily, the products mentioned above will continue to exist, but other Android Labs projects like BreadCrumb won’t be so lucky. It seemed at first as if App Inventor was also on that “do not resuscitate” list. Thankfully, however, Google announced that they would open source the project to whoever was willing to pick it up.

Enter MIT. MIT has created a new Center for Mobile Learning, housed in the famed MIT Media Lab. There, an open-sourced App Inventor will begin again in the hands of its original creator, Hal Abelson, along with fellow MIT professors Eric Klopfer and Mitchel Resnick. Through this partnership, App Inventor will likely be re-released under a dual Google/MIT license.
So MIT already has a fair amount of geek points stored up, right? I mean, they’re MIT, after all. Well, you can add at least another +10 to their geek cred with their latest creation, the Junkyard Jumbotron. Designed by Rick Borovoy, Ph.D. and Brian Knep, the web-based project allows you to arrange any number of devices into a conglomeration of your choosing. You then point all of the devices’ web browsers to the Junkyard Jumbotron website, e-mail a picture to the project, and voila! (Somewhat) instant techno-mosaic. Feel free to try it yourself via the source link below, and check out the video there as well.
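The core trick behind a setup like this is splitting one picture into per-device tiles. Here is a minimal sketch of that step, assuming a simple rows-by-columns grid of identical screens (the real project also detects each device’s position, size, and rotation from a calibration image, which this toy version skips):

```python
# Illustrative sketch: split one "image" (a 2D list of pixel values)
# into rows * cols tiles, one tile per device screen, in row-major
# order. The grid layout is an assumption for demonstration.

def tile_image(pixels, rows, cols):
    """Split a 2D pixel grid into rows*cols equal tiles."""
    height, width = len(pixels), len(pixels[0])
    tile_h, tile_w = height // rows, width // cols
    tiles = []
    for r in range(rows):
        for c in range(cols):
            tile = [row[c * tile_w:(c + 1) * tile_w]
                    for row in pixels[r * tile_h:(r + 1) * tile_h]]
            tiles.append(tile)
    return tiles

# A 4x4 "image" whose pixels are just their index, split across
# a 2x2 grid of screens.
image = [[y * 4 + x for x in range(4)] for y in range(4)]
tiles = tile_image(image, 2, 2)
print(tiles[0])  # top-left screen's tile: [[0, 1], [4, 5]]
```

Each browser pointed at the site would then simply fetch and display its own tile, which is why the whole thing works with nothing but a web page on every device.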
With all the emphasis placed on the entertainment and social media aspects of smartphones, academic functions are often overlooked. This is a shame, really, as there are some pretty incredible applications being developed that have very practical use in the fields of math and science.
Researchers at MIT have developed an app framework that allows scientists to design a custom Android application for carrying out calculations based on data generated by the Texas Advanced Computing Center’s Ranger supercomputer. Ranger does the bulk of the work, but the application allows quick, on-the-fly manipulation of the data for real-time use in the field.
Check out the video below for a demonstration of the app, and be sure to hit the source link below for additional information and a link to the project’s SourceForge page.
Have bad vision? So bad you have trouble seeing your Android phone? Well, prepare to have the problem fixed… ok, not right now, but developers at MIT are working on it. They have developed an Android app that lets the user manipulate sets of visuals (lines and dots) while looking through a small “eye” built from parts of a holographic barcode reader, which costs only about $2 to make. Through a series of changes to the visuals on the phone, the user brings everything into focus for their own eyesight, and the app spits out a prescription based on the results.
Check out this video, showing some of the specifics: