Synopsis: Domains: ICT: General ICT: Data:


newsoffice 00003.txt

pixels are illuminated by a white LED backlight that passes through blue, red, and green filters to produce the colors on the screen.

With more light shining through the pixels, LCD TVs equipped with Color IQ produce 100 percent of the color gamut,


newsoffice 00011.txt

which uses sensor identification badges and analytics tools to track behavioral data on employees, providing insights that can increase productivity.

and using that data to build a baseball team. But what if I could say, here's how you need to talk to customers, here's how people need to collaborate with each other

Individuals can use that data to boost performance, and a company can use it to help set up an environment where everybody's going to succeed.

Readers placed around an office collect the data and push it to the cloud. Individuals have access to their personal data via a Web dashboard

By combining this information with employee-performance data from surveys, interviews, and objective performance metrics, Sociometric can pinpoint areas where management can build more productive offices, in ways as surprising as providing larger lunch tables or moving coffee stations to increase interaction.

Accumulating more than 2,000 hours of data and comparing that data with survey results, they predicted with 60 percent accuracy that close-knit groups of workers who spoke frequently with one another were more satisfied

and got more work done more efficiently. They also found evidence of communication overload, where high volumes of email, due to a lack of face-to-face interaction, were causing some employees difficulty in concentrating


newsoffice 00012.txt

If you could turn the DNA inside a cell into a little memory device on its own


newsoffice 00025.txt

and analyzing vast quantities of data related to the diverse types of bacteria within the human body and their interactions with each other and the body's own cells and organs.

but our ability to translate this data into usable knowledge is lagging behind, says Arup K. Chakraborty, the Robert T. Haslam (1911) Professor of Chemical Engineering, Physics, Chemistry, and Biological Engineering at MIT and director of the MIT Institute


newsoffice 00046.txt

Prepared to send sizeable chunks of data at any given time, the amplifiers stay at maximum voltage, eating away more power than any other smartphone component and about 75 percent of electricity consumption in base stations, and wasting

The AMO technology was a new transmitter architecture, where algorithms could choose from the different voltages needed to transmit data in each power amplifier.

This could be done on the transmitting and receiving end of data transfers. This caught the eye of Astrom, who had come to MIT after working in the mobile industry for 10 years, looking for the next big thing.
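
A minimal sketch of the voltage-selection idea described above: instead of holding the amplifier at its maximum supply, an algorithm picks, sample by sample, the smallest supply level that still covers the signal envelope. The levels, headroom factor, and function names below are illustrative assumptions, not Eta Devices' actual implementation.

```python
# Illustrative sketch (not the actual AMO implementation): choose, sample by sample,
# the smallest supply voltage from a discrete set that covers the signal envelope.

SUPPLY_LEVELS = [0.5, 1.0, 2.0, 4.0]  # hypothetical discrete supply voltages (volts)

def select_supply(envelope_sample, headroom=1.1):
    """Return the lowest supply level that covers the required output amplitude."""
    required = envelope_sample * headroom
    for level in SUPPLY_LEVELS:
        if level >= required:
            return level
    return SUPPLY_LEVELS[-1]  # clip at the maximum available supply

# Example: a bursty envelope spends most of its time on the low supplies,
# rather than sitting at a fixed 4 V.
envelope = [0.2, 0.3, 1.5, 3.2, 0.4, 0.1]
print([select_supply(s) for s in envelope])   # [0.5, 0.5, 2.0, 4.0, 0.5, 0.5]
```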

but you could see how much data traffic would explode. Fleshing out a business plan from an

Future-proofing technology Today, Eta Devices' major advantage is that its technology is able to handle ever-increasing data bandwidths.


newsoffice 00065.txt

along with several MIT graduate students, to test much smaller versions of the device in animals. The Deshpande funding was an absolutely critical element in getting the data necessary to raise capital for Taris,

Indeed, collecting clinical data is a major challenge in spinning biotechnology out of the lab, notes Cima,

Thanks to the data gathered from the study, Cima and Langer were able to launch Taris, with Lee as chief scientist,

collected the data to determine it was thought feasible, that it was something that could make a big impact.


newsoffice 00086.txt

and other packets of data between continents, all at the speed of light. A rip or tangle in any part of this network can significantly slow telecommunications around the world.

and data loss. We now have a set of design guidelines that allow you to tune certain parameters to achieve a particular pattern,


newsoffice 00093.txt

social media, data streams, and digital content. Pattern discovery and data visualization will be explored to reveal interaction patterns and shared interests in relevant social systems,

and data can be an effective catalyst for increasing accountability and transparency, creating mutual visibility among institutions and individuals.


newsoffice 00103.txt

The findings need to be validated with more data, and it may be difficult to develop a reliable diagnostic based on this signature alone, Vander Heiden says.


newsoffice 00144.txt

so it can collect all that data and display that info to a user, says Downey, Airware's CEO,

and collects and displays data. Airware then pushes all data to the cloud, where it is aggregated and analyzed,

and available to designated users. If a company decides to use a surveillance drone for crop management, for instance,


newsoffice 00150.txt

or wireless gloves to seamlessly scroll through and manipulate visual data on a wall-sized, panoramic screen.

Putting pixels in the room G-speak has its roots in a 1999 MIT Media Lab project co-invented by Underkoffler in Professor Hiroshi Ishii's Tangible Media Group, called Luminous Room,

which enabled all surfaces to hold data that could be manipulated with gestures. It literally put pixels in the room with you,

and the data could interact with, and be controlled by, physical objects. They also assigned pixels three-dimensional coordinates.

Imagine, for example, if you sat down in a chair at a table, and tried to describe where the front,

and this much in front of me, among other things, Underkoffler explains. We started doing that with pixels.

And the pixels surrounded the model, Underkoffler says. This provided three-dimensional spatial information, from which the program cast accurate, digital shadows from the models onto the table.

A shared-pixel workspace has enormous value no matter what your business is. Today, Oblong is shooting for greater ubiquity of its technology.


newsoffice 00156.txt

Authoritatively answering that question requires analyzing huge volumes of data, which hasn't been computationally feasible with traditional methods.

So the researchers also analyzed the data on the assumption that only trips starting within a minute of each other could be combined.

In analyzing taxi data for ride-sharing opportunities, typically the approach that was taken was a variation of the so-called traveling-salesman problem, Santi explains.

Next, the algorithm represents the shareability of all 150 million trips in the database as a graph.
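
A minimal sketch of such a shareability graph, assuming (as the fragments above suggest) that two trips can be combined only if they start within a minute of each other; the pickup-proximity rule, field names, and thresholds below are illustrative assumptions, not the study's actual criteria.

```python
# Sketch of a shareability graph: trips are nodes, and an edge joins two trips
# that could plausibly be shared. Only the one-minute start window comes from the
# text; everything else here is a placeholder for illustration.
from itertools import combinations
from math import dist

def shareable(a, b, max_start_gap_s=60.0, max_pickup_gap_km=0.5):
    """Hypothetical test: trips start within 60 s and pick up within 0.5 km."""
    return (abs(a["start"] - b["start"]) <= max_start_gap_s
            and dist(a["pickup"], b["pickup"]) <= max_pickup_gap_km)

def build_shareability_graph(trips):
    """Return edges (pairs of trip IDs) of the shareability graph."""
    return [(a["id"], b["id"]) for a, b in combinations(trips, 2) if shareable(a, b)]

trips = [
    {"id": 1, "start": 0.0,   "pickup": (0.0, 0.0)},
    {"id": 2, "start": 30.0,  "pickup": (0.2, 0.1)},
    {"id": 3, "start": 400.0, "pickup": (0.1, 0.0)},
]
print(build_shareability_graph(trips))   # [(1, 2)]; a matching on this graph pairs riders
```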

if it ran on a server used to coordinate data from cellphones running a taxi-sharing app.

whereas the GPS data indicated that, on average, about 300 new taxi trips were initiated in New York every minute.

Finally, an online application designed by Szell, HubCab, allows people to explore the taxi data themselves, using a map of New York as an interface.

David Mahfouda, the CEO of the car- and taxi-hailing company Bandwagon, whose business model is built specifically around ride sharing, says that his company hired analysts to examine the same data set that Santi

Making the entire data set available on a queryable basis does seem like a significantly larger lift.


newsoffice 00159.txt

Algorithms distinguish the pills by matching them against a database of nearly all pills in circulation.

If a pill isn't in MedEye's database, because it's new, for instance, the system alerts the nurse, who adds the information into the software for next time.

That's when we realized what a change it would be for a hospital to collect data


newsoffice 00169.txt

that uses precalculated supercomputer data for structural components, like simulated Legos, to solve FEA models in seconds.

When app users plugged in custom parameters for problems, such as the diameter of that spherical obstacle, the app would compute a solution for the new parameters by referencing the precomputed data.

After that, the software will reference the precomputed data to create a highly detailed 3-D simulation in seconds.
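
The lookup described above amounts to the classic trade of storage for speed that the later fragments discuss. A toy sketch of that pattern follows; the placeholder solve, its parameters, and the formula are invented for illustration and are not Akselos' actual component-based solver.

```python
# Toy illustration of reusing precomputed results keyed by input parameters,
# rather than re-solving from scratch. Only the caching/lookup idea is shown.
import functools
import time

@functools.lru_cache(maxsize=None)
def solve_component(obstacle_diameter_mm: float, load_kn: float) -> float:
    """Stand-in for an expensive FEA solve; returns a fake peak-stress value."""
    time.sleep(0.5)                                  # pretend this is supercomputer time
    return load_kn / (obstacle_diameter_mm ** 2)     # placeholder formula, not real mechanics

solve_component(40.0, 12.0)   # slow: computed once and stored
solve_component(40.0, 12.0)   # fast: the stored result is simply referenced
```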

Ultimately, pushing the data to the cloud has helped Akselos, by leveraging the age-old tradeoff between speed and storage:

By storing and reusing more data, algorithms can do less work and hence finish more quickly. These days,

with cloud technology, storing lots of data is no big deal. We store a lot more data than other methods,

but that data, in turn, allows us to go faster, because we're able to reuse as much precomputed data as possible,

he says. Bringing technology to the world Akselos was founded in 2012, after Knezevic and Huynh,

along with Leurent, who actually started FEA work with Patera's group back in 2000, earned a Deshpande innovation grant for their supercomputing-on-a-smartphone innovation. That was a trigger,


newsoffice 00195.txt

That corresponds to five thousandths of a pixel in a close-up image, but from the change of a single pixel's color value over time,

it's possible to infer motions smaller than a pixel. Suppose, for instance, that an image has a clear boundary between two regions:

Everything on one side of the boundary is blue; everything on the other is red.

If, over successive frames of video, the blue region encroaches into the red region, even by less than the width of a pixel, the purple boundary pixel will grow slightly bluer.
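
A toy worked example of the inference described above, assuming the boundary pixel is a linear blend of the two region colors; the channel values are invented, and this is only an illustration of the principle, not the researchers' algorithm.

```python
# A boundary pixel's value is a blend of the two region colors, so a tiny shift of
# the edge shows up as a proportional change in that one pixel, and the shift can
# be recovered even when it is far smaller than a pixel.
BLUE, RED = 0.2, 0.9          # single-channel stand-ins for the two region colors

def pixel_value(edge_pos):
    """edge_pos = fraction of this pixel covered by the blue region (0..1)."""
    return BLUE * edge_pos + RED * (1.0 - edge_pos)

def infer_shift(value_before, value_after):
    """Recover the sub-pixel edge displacement from the change in pixel value."""
    return (value_after - value_before) / (BLUE - RED)

v0, v1 = pixel_value(0.50), pixel_value(0.52)   # edge moves by 0.02 pixel
print(infer_shift(v0, v1))                      # ~0.02, i.e. 1/50 of a pixel
```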

Putting it together Some boundaries in an image are fuzzier than a single pixel in width, however.


newsoffice 00197.txt

The data shows a loss of almost 1 percent of efficiency per week. But at present, even in desert locations, the only way to counter this fouling is to hose the arrays down, a labor- and water-intensive method.


newsoffice 00201.txt

The difficulty with this approach is that simulating a single pixel in the virtual image requires multiple pixels of the physical display.

So the physical pixels projecting light to the right side of the pupil have to be offset to the left

and the pixels projecting light to the left side of the pupil have to be offset to the right.

The use of multiple on-screen pixels to simulate a single virtual pixel would drastically reduce the image resolution.

The algorithm that computes the image to be displayed onscreen can exploit that redundancy, allowing individual screen pixels to participate simultaneously in the projection of different viewing angles.

In the researchers' prototype, however, display pixels do have to be masked from the parts of the pupil for which they're not intended.


newsoffice 00202.txt

which has amassed a vast facial-expression database, is also setting its sights on a mood-aware Internet that reads a user's emotions to shape content.

or human-to-human interaction point capture data and use it to enrich the user experience.

or confusion, and pushes the data to a cloud server, where Affdex aggregates the results from all the facial videos (sometimes hundreds)

Years of data-gathering have trained the algorithms to be very discerning. As a PhD student at Cambridge University in the early 2000s, el Kaliouby began developing facial-coding software.

Back then, the data that el Kaliouby had access to consisted of about 100 facial expressions gathered from photos

and training the algorithms by collecting vast stores of data. Coming from a traditional research background, the Media Lab was completely different, el Kaliouby says.

Already, Affectiva has conducted pilot work for online learning, where it captured data on facial engagement to predict learning outcomes.

To be able to capture that data in real time means educators can adapt that learning experience


newsoffice 00214.txt

and generated one-third of the DNA sequence data for that project, the single largest contribution to the effort.

In the spirit of the Human Genome Project, the Broad makes its genomic data freely available to researchers around the world.

and disseminate discoveries, tools, methods, and data openly to the entire scientific community. Founded by MIT, Harvard


newsoffice 00219.txt

and robotic joint angles multiple times with various objects, then analyzed the data and found that every grasp could be explained by a combination of two or three general patterns among all seven fingers.
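
The finding that all grasps reduce to two or three shared patterns is the kind of result a principal component analysis of the joint-angle recordings would produce; the fragment does not specify the study's actual method, so the sketch below is only an illustration of that style of analysis.

```python
# Hedged sketch: count how many principal components of recorded joint angles are
# needed to explain most of the variance, i.e. how many general grasp patterns
# the data supports. The data shape and threshold are assumptions.
import numpy as np

def grasp_components(joint_angles, variance_target=0.95):
    """joint_angles: (n_grasps, n_joints) array. Returns the number of components
    needed to reach `variance_target` of the variance, plus the patterns themselves."""
    centered = joint_angles - joint_angles.mean(axis=0)
    _, s, vt = np.linalg.svd(centered, full_matrices=False)
    explained = np.cumsum(s ** 2) / np.sum(s ** 2)
    k = int(np.searchsorted(explained, variance_target) + 1)
    return k, vt[:k]          # k synergies and their joint-space patterns
```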


newsoffice 00232.txt

#Own your own data Cellphone metadata has been in the news quite a bit lately, but the National Security Agency isn't the only organization that collects information about people's online behavior.

So if we want the benefits of data mining, like personalized recommendations or localized services, how can we protect our privacy?

Their prototype system, openPDS, short for personal data store, stores data from your digital devices in a single location that you specify:

Any cellphone app, online service, or big-data research team that wants to use your data has to query your data store

Sharing code, not data The example I like to use is personalized music, says Yves-Alexandre de Montjoye, a graduate student in media arts and sciences and first author on the new paper.

you don't share data. Instead of you sending data to Pandora for Pandora to define what your musical preferences are, it's Pandora sending a piece of code to you for you to define your musical preferences

and send it back to them. De Montjoye is joined on the paper by his thesis advisor, Alex "Sandy" Pentland, the Toshiba Professor of Media Arts and Sciences;
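
A minimal sketch of the "share code, not data" pattern just described: the service submits a small piece of code to the user's personal data store, which runs it locally and returns only the computed summary. All class, function, and field names below are hypothetical, not the openPDS API.

```python
# The raw data never leaves the store; only the small derived answer does.
class PersonalDataStore:
    def __init__(self, listening_history):
        self._history = listening_history          # raw records stay here

    def run_query(self, query_fn):
        """Execute third-party code locally and return only its answer."""
        return query_fn(self._history)

# The "piece of code Pandora sends you" in this toy example:
def genre_preferences(history):
    counts = {}
    for track in history:
        counts[track["genre"]] = counts.get(track["genre"], 0) + 1
    total = sum(counts.values())
    return {genre: n / total for genre, n in counts.items()}

store = PersonalDataStore([{"genre": "jazz"}, {"genre": "jazz"}, {"genre": "rock"}])
print(store.run_query(genre_preferences))   # only this summary leaves the store
```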

Although openPDS can, in principle, run on any machine of the user's choosing, in the trials data is being stored in the cloud.

In fact, applications frequently collect much more data than they really need. Service providers and application developers don't always know in advance what data will prove most useful,

so they store as much as they can against the possibility that they may want it later.

openPDS preserves all that potentially useful data, but in a repository controlled by the end user, not the application developer or service provider.

If we manage to get people to have access to most of their data, and if we can get the overall state of the art to move from anonymization to interactive systems, that would be such a huge win.

because it allows users to control their data and, at the same time, open up its potential, both at the economic level


newsoffice 00252.txt

and data travels between cores in packets of fixed size. This week, at the International Symposium on Computer Architecture, Peh's group unveiled a 36-core chip that features just such a network-on-chip, in addition to implementing many of the group's earlier ideas

or ensuring that cores' locally stored copies of globally accessible data remain up to date. In today's chips, all the cores, typically somewhere between two and six, are connected by a single wire,

a local, high-speed memory bank in which it stores frequently used data. As it performs computations,

it updates the data in its cache, and every so often, it undertakes the relatively time-consuming chore of shipping the data back to main memory.

But what happens if another core needs the data before it's been shipped? Most chips address this question with a protocol called snoopy,

because it involves snooping on other cores' communications. When a core needs a particular chunk of data, it broadcasts a request to all the other cores,

and whichever one has the data ships it back. If all the cores share a bus,

then when one of them receives a data request, it knows that it's the most recent request that's been issued.

Similarly, when the requesting core gets data back, it knows that it's the most recent version of the data.

But in a network-on-chip, data is flying everywhere, and packets will frequently arrive at different cores in different sequences.

The implicit ordering that the snoopy protocol relies on breaks down. Daya, Peh, and their colleagues solve this problem by equipping their chips with a second network, which shadows the first.

The circuits connected to this network are very simple: All they can do is declare that their associated cores have sent requests for data over the main network.

But precisely because those declarations are so simple, nodes in the shadow network can combine them
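
A simplified software analogy of the idea just described, assuming one-bit declarations per core per cycle on the shadow network: because every core sees the same declarations in the same cycle, all cores can derive an identical global ordering of requests, even though the request packets themselves arrive in different orders on the main network. This is only an illustration, not the chip's actual circuitry.

```python
# The shadow network carries only "core i sent a request this cycle" flags; combining
# them per cycle (with a fixed tie-break) yields one ordering every core agrees on.
def global_order(declarations_per_cycle):
    """declarations_per_cycle: list of sets of core IDs that declared a request
    in that cycle. Returns the ordering every core will independently derive."""
    order = []
    for cycle, cores in enumerate(declarations_per_cycle):
        for core in sorted(cores):          # fixed tie-break within a cycle
            order.append((cycle, core))
    return order

# Cycle 0: cores 3 and 1 request; cycle 1: core 2 requests.
print(global_order([{3, 1}, {2}]))   # [(0, 1), (0, 3), (1, 2)] on every core
```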


newsoffice 00270.txt

#Who's using your data? By now, most people feel comfortable conducting financial transactions on the Web.

As more of our data moves online, a more pressing concern may be its inadvertent misuse by people authorized to access it.

At the same time, tighter restrictions on access could undermine the whole point of sharing data. Coordination across agencies and providers could be the key to quality medical care;

which will automatically monitor the transmission of private data and allow the data owner to examine how it s being used.

At the IEEE's Conference on Privacy, Security and Trust in July, Oshani Seneviratne, an MIT graduate student in electrical engineering and computer science, and Lalana Kagal, a principal research scientist at CSAIL, will present a paper

With HTTPA, each item of private data would be assigned its own uniform resource identifier (URI), a key component of the Semantic Web, a new set of technologies championed by the W3C that would convert the Web from essentially a collection of searchable

text files into a giant database. Remote access to a Web server would be controlled much the way it is now, through passwords and encryption.

But every time the server transmitted a piece of sensitive data, it would also send a description of the restrictions on the data's use.

But HTTPA compliance could become a selling point for companies offering services that handle private data.

if it reuses data supplied by another HTTPA-compliant source. Suppose, for instance, that a consulting specialist in a network of physicians wishes to access data created by a patient's primary-care physician

and suppose that she wishes to augment the data with her own notes. Her system would then create its own record with its own URI.

But using standard Semantic Web techniques, it would mark that record as derived from the PCP's record

When the data owner requests an audit, the servers work through the chain of derivations, identifying all the people who have accessed the data and what they've done with it.
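
A hypothetical sketch of such a derivation chain and audit walk, assuming each record carries its own URI, usage restrictions, an access log, and a link to the record it was derived from. The data structure below is invented for illustration and is not the actual HTTPA wire format.

```python
# Each record links to the record it was derived from; an audit walks that chain.
records = {
    "uri:pcp/note-17": {"restrictions": ["no-marketing"],
                        "accessed_by": ["dr_primary"],
                        "derived_from": None},
    "uri:specialist/note-4": {"restrictions": ["no-marketing"],
                              "accessed_by": ["dr_specialist"],
                              "derived_from": "uri:pcp/note-17"},
}

def audit(uri):
    """Walk the derivation chain and report everyone who touched the data."""
    trail = []
    while uri is not None:
        rec = records[uri]
        trail.append((uri, rec["accessed_by"]))
        uri = rec["derived_from"]
    return trail

print(audit("uri:specialist/note-4"))
# [('uri:specialist/note-4', ['dr_specialist']), ('uri:pcp/note-17', ['dr_primary'])]
```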

Redundant storage of the same data on multiple servers serves two purposes: First, it ensures that

if some servers go down, data will remain accessible. And second, it provides a way to determine

whether anyone has tried to tamper with the transaction logs for a particular data item, such as to delete the record of an illicit use.

and filled it with data supplied by 25 volunteers. She then simulated a set of transactions (pharmacy visits, referrals to specialists, use of anonymized data for research purposes,

and the like) that the volunteers reported as having occurred over the course of a year.

In experiments, the system efficiently tracked down data stored across the network and handled the chains of inference necessary to audit the propagation of data across multiple providers.

In practice, audit servers could be maintained by a grassroots network, much like the servers that host BitTorrent files or log Bitcoin transactions.


newsoffice 00271.txt

That's why KGS aims to make buildings better with cloud-based software, called Clockworks, that collects existing data on a building's equipment, specifically in HVAC (heating, ventilation,

The software then translates the data into graphs, metrics, and text that explain monetary losses, where it's available for building managers, equipment manufacturers,

is one of a few ventures gathering equipment-level data through various sensors, actuators, and meters attached to equipment that measure functionality.

Clockworks sifts through that massive store of data, measuring temperatures, pressures, flows, set points, and control commands, among other things.

it helps the software to rapidly equate data with monetary losses. When we identify that there's a fault with the right data,

Seeing building data as an emerging tool for fault detection and diagnostics, however, they turned to Samouhos's PhD dissertation,

and a framework for an early KGS module. We all came together anticipating that the building industry was about to change a lot in the way it uses data,

where you take the data, you figure out what's not working well, and do something about it,

Liberating data By bringing all this data about building equipment to the cloud, the technology has plugged into the Internet of things, a concept where objects would be connected, via embedded chips and other methods, to the Internet for inventory and other purposes.

Data on HVAC systems have been connected through building automation for some time. KGS, however, can connect that data to cloud-based analytics and extract really rich information about equipment,

Gayeski says. For instance, he says, the startup has quick-response codes, like a barcode, for each piece of equipment it measures,

so people can read all data associated with it. As more and more devices are connected readily to the Internet,

Gayeski says. And that data can be liberated from its local environment to the cloud, Grace adds.

and other sensors begins to unlock the data at the residential scale, Gayeski says, KGS could adapt over time into that space, as well.


newsoffice 00317.txt

High-speed 3-D imaging Neurons encode information (sensory data, motor plans, emotional states, and thoughts) using electrical impulses called action potentials,

which currently takes a few minutes to analyze one second of imaging data. The researchers also plan to combine this technique with optogenetics,


newsoffice 00319.txt

Their system can receive data in the form of eight images per frame of video

Spreading pixels Oliver Cossairt, an assistant professor of electrical engineering and computer science at Northwestern University, once worked for a company that was attempting to commercialize glasses-free 3-D projectors. What

which is what projection optics does, moved essentially the pixels away from each other, Cossairt continues. That allowed them to break this invariance


newsoffice 00323.txt

and nobody has given us labeled training data saying, there are two pieces in a dive and seven pieces in a weightlifting and 21 pieces in a hammer throw, and these are their names.


newsoffice 00334.txt

Van Voorhis used experimental data gathered in samples specially synthesized by Baldo and Timothy Swager, MIT's John D. MacArthur Professor of Chemistry.

The researchers are pleased with the agreement between their experimental and theoretical data, especially given the systems being modeled.


newsoffice 00340.txt

but it can benefit from a bunch of extra data that you haven't labeled in detail.


newsoffice 00354.txt

and provides real-time data thanks to using exoelectrogens as sensors. These bugs are generating electricity,


newsoffice 00360.txt

Another important application for this test could be studying fundamental biological processes such as how cells recruit backup repair systems to fill in


newsoffice 00390.txt

The study covers prices of thousands of products, drawing on data from four major international firms:

Online pricing data was scraped using a harvesting technique that Rigobon and Cavallo first developed for the Billion Prices Project,

They would also like to collect more data illuminating how companies set prices when new goods are first introduced.


newsoffice 00400.txt

who was not involved in this research. What's nice here is that this is a comprehensive approach to try to get a good amount of data on many different cells.

The researchers believe there may be still more types of neurons that did not appear in their data set


newsoffice 00407.txt

and paper or spreadsheets to track contamination, which makes it nearly impossible to gather large amounts of data,

nature provides a rich database of phages which target desired bacteria. Thus, by sourcing from nature, we can adapt the platform to other pathogens and applications,


newsoffice 00427.txt

The startup also collects operational data from the vehicles to inform fleet managers of the best vehicles for the technology, usually ones traveling in the stop


newsoffice 00469.txt

As a patient moves the robot arm, the robot collects motion data, including the patient's arm speed, movement smoothness, and aim.

For the current study, the researchers collected such data from 208 patients who worked with the robot seven days after suffering a stroke,

The researchers created an artificial neural network map that relates a patient's motion data to a score that correlates with a standard clinical outcome measurement.
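
A sketch of the shape of such a mapping, as a small regression network trained on (motion features, clinical score) pairs. The study's actual features, architecture, and scores are not given in these fragments; everything below, including the synthetic data, is illustrative only.

```python
# Toy regression network mapping per-patient motion features to an outcome score.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(208, 3))                     # e.g. arm speed, smoothness, aim (fake)
y = X @ np.array([0.8, 1.5, -0.4]) + rng.normal(scale=0.1, size=208)  # stand-in scores

model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0).fit(X, y)
print(model.predict(X[:3]))                       # predicted clinical-outcome scores
```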


newsoffice 00493.txt

but the MIT researchers are the first to use it to link fMRI and MEG data from human subjects.

and twice in an MEG scanner, giving the researchers a huge set of data on the timing and location of brain activity.

Millisecond by millisecond By analyzing this data, the researchers produced a timeline of the brain's object-recognition pathway that is very similar to results previously obtained by recording electrical signals in the visual cortex of monkeys,

The MIT researchers are now using representational similarity analysis to study the accuracy of computer models of vision by comparing brain scan data with the models' predictions of how vision works.
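
A minimal sketch of representational similarity analysis as commonly practiced: build a representational dissimilarity matrix (RDM) for each data source (brain patterns or a model's features over the same stimuli) and correlate their entries. The arrays and sizes below are stand-ins, not the study's data.

```python
# RSA sketch: two sources agree representationally if stimuli that evoke similar
# patterns in one source also evoke similar patterns in the other.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
fmri_patterns = rng.normal(size=(92, 200))    # 92 stimuli x voxel responses (fake)
model_features = rng.normal(size=(92, 128))   # same stimuli through a vision model (fake)

rdm_fmri = pdist(fmri_patterns, metric="correlation")    # condensed RDM (upper triangle)
rdm_model = pdist(model_features, metric="correlation")

rho, _ = spearmanr(rdm_fmri, rdm_model)
print("representational similarity:", rho)    # near 0 here, since both are random
```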


newsoffice 00517.txt

Also, the APA can function as a backup for a helicopter if something goes awry with the primary hoist:


newsoffice 00533.txt

and replace it with the most memorable one in our database, we want your face to still look like you.

More memorable or less The system could ultimately be used in a smartphone app to allow people to modify a digital image of their face before uploading it to their social networking pages.

To develop the memorability algorithm, the team first fed the software a database of more than 2,000 images.


newsoffice 00550.txt

#3-D images with only one photon per pixel Lidar rangefinders, which are common tools in surveying

and Computer Science and lead author on the new paper, explains that the very idea of forming an image with only a single photon detected at each pixel location is counterintuitive.

The way a camera senses images is through different numbers of detected photons at different pixels, Kirmani says.

each location in the grid corresponds to a pixel in the final image. The technique, known as raster scanning, is how old cathode-ray-tube televisions produced images, illuminating one phosphor dot on the screen at a time.

It guides the filtering process by assuming that adjacent pixels will more often than not have similar reflective properties
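
A crude stand-in for the filtering idea just described: since neighboring pixels usually share reflective properties, each noisy single-photon estimate can borrow strength from its neighbors. The simple local-median filter below only illustrates that assumption; it is not the researchers' actual algorithm.

```python
# Smooth a sparse photon-count image by assuming neighbors share reflectivity.
import numpy as np

def neighborhood_filter(photon_image, radius=1):
    """Replace each pixel with the median of its (2r+1) x (2r+1) neighborhood."""
    padded = np.pad(photon_image, radius, mode="edge")
    out = np.empty_like(photon_image, dtype=float)
    rows, cols = photon_image.shape
    for i in range(rows):
        for j in range(cols):
            window = padded[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            out[i, j] = np.median(window)
    return out
```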


newsoffice 00553.txt

"Kadambi says. hat is because the light that bounces off the transparent object and the background smear into one pixel on the camera.

and returns to strike the pixel. Since the speed of light is known, it is then simple for the camera to calculate the distance the signal has travelled
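
A short worked example of that round-trip calculation: distance = (speed of light x round-trip time) / 2.

```python
# Round-trip time-of-flight to distance.
C = 299_792_458.0               # speed of light, m/s

def distance_from_round_trip(t_seconds):
    return C * t_seconds / 2.0

print(distance_from_round_trip(20e-9))   # a 20 ns round trip is roughly 3.0 m to the object
```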

when the data comes back, we can do calculations that are very common in the telecommunications world,

and apply sophisticated computation to the resulting data, Davis says. Normally the computer scientists who could invent the processing on this data cannot build the devices,

and the people who can build the devices cannot really do the computation, he says. This combination of skills



