Synopsis: Ict: Data:


Nature 04482.txt

and staff to analyse the data and identify mutations that might be causing the undiagnosed diseases that afflict his clients' families.

So Jalas, the centre's director of genetics resources and services, has outsourced parts of the analysis. He uploads his clients' sequencing data to cloud-computing software platforms

And the cloud-based interfaces let him collaborate with doctors in Israel without worrying about repeatedly transferring data on slow Internet connections.

Doctors will increasingly want to use sequencing data to guide decisions about patient care, but might not necessarily want to invest in staff

and software to make sense of those data. "It's a huge unmet need," says David Ferreiro, a biotechnology analyst with investment bank Oppenheimer & Company in New York,

Last year, Illumina opened BaseSpace Apps, a marketplace for online analysis tools to be used on data uploaded to the company's own cloud-computing platform.

and hospitals to analyse data. But one of the biggest questions will be how deeply analysis companies can reach into medical settings,

and clinical geneticists may be uneasy about uploading data to the cloud. "It's your licence and your lab that go on the line

Bina Technologies in Redwood City sells a server that can sit in a customer's own data centre

the range of customers who need to interpret sequence data is growing, and each has their own needs."


Nature 04483.txt

Lockheed has proven technologies and the most nodule-bed data. Polymetallic nodules form over thousands of years on the sea floor, through processes that are still not fully understood;

Data are so far sparse on the degree to which the operations would threaten deep-sea life such as sediment-dwelling sea cucumbers, worms and small crustaceans,

Craig Smith, a deep-sea biologist at the University of Hawaii at Manoa, will lead an initial assessment of seafloor life for Lockheed's project, gathering baseline data for the potential harvest zone


Nature 04485.txt

In a series of hour-long experiments, each of which generated 1 terabyte (1 million megabytes) of data,


neurogadget.com 2015 000012.txt

The researchers say that the wireless BCI is able to stream thought commands via its radio at a rate of 48 megabits per second, about the speed of a home Internet connection.

The amount of data transmitted daily by the device equals about the amount of data stored on 200 DVDs.
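
As a rough back-of-the-envelope check (assuming continuous round-the-clock streaming and 4.7 GB single-layer DVDs, neither of which is stated in the excerpt), the quoted 48 megabits per second works out to a few hundred gigabytes per day, the same order of magnitude as the 200-DVD figure:

# Back-of-the-envelope check of the reported figures (assumptions:
# continuous 24 h streaming, 4.7 GB single-layer DVDs).
RATE_MBPS = 48                      # reported radio rate, megabits per second
SECONDS_PER_DAY = 24 * 60 * 60
DVD_GB = 4.7                        # capacity of a single-layer DVD

bytes_per_day = RATE_MBPS * 1e6 / 8 * SECONDS_PER_DAY    # bits -> bytes
gigabytes_per_day = bytes_per_day / 1e9

print(f"~{gigabytes_per_day:.0f} GB/day")                      # ~518 GB/day
print(f"~{gigabytes_per_day / DVD_GB:.0f} single-layer DVDs")  # ~110 DVDs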


neurosciencenews.com 2015 000010.txt

In order to address the big data challenge of the human brain, researchers at the SPECS lab of Paul Verschure have recently developed Brainx3,

which is a platform for the visualization, simulation, analysis of and interaction with large data sets, one that combines computational power with human intuition in representing

Brainx3 serves as a hypothesis generator for big data. As is often the case with complex data,

one might not always have a specific hypothesis to start with. Instead, discovering meaningful patterns and associations in big data might be a necessary incubation step for formulating well-defined hypotheses.

On this platform, the researchers reconstructed a large-scale simulation of human brain activity in a 3D virtual reality environment.


neurosciencenews.com 2015 000027.txt

Searching through the enormous database generated in the ALS study, Dr. Goldstein and his colleagues found several genes that appear to contribute to ALS,

"We hope to be able to use the genetic data from each ALS patient to direct that person to the most appropriate clinical trials and,

ultimately, use the data to prescribe treatment."


neurosciencenews.com 2015 000032.txt

#New Brain Mapping Reveals Unknown Cell Types Using a process known as single cell sequencing, scientists at Karolinska Institutet have produced a detailed map of cortical cell types and the genes active within them.


neurosciencenews.com 2015 000040.txt

the research team assembled a wealth of data that enabled them to model not only what happens during the progression of Alzheimer's disease,

if one stage in the process was somehow switched off. "We had reached a stage where we knew what the data should look like


neurosciencenews.com 2015 000068.txt

graduate students and postdoctoral fellows in Rao's lab searched through patient databases to see if it had other effects on human health.


neurosciencenews.com 2015 000074.txt

These data identify somatic genomic changes in single neurons affecting known and unknown loci which are increased in sporadic AD


neurosciencenews.com 2015 000095.txt

and other aberrations. "The exciting part is that this approach can acquire data at the same high speed per pixel as conventional microscopy,

000 times slower per pixel," says George Church, a professor of genetics at Harvard Medical School who was not part of the research team.


news.discovery.com 2015 01289.txt.txt

and exactly how these machines will collect facial recognition data. The United States currently does not use this kind of ATM biometric technology


news.discovery.com 2015 01321.txt.txt

The video analysis data was then combined with clinical input from caregivers to determine pain-level scores for each patient.

and provide data for better administration of pain relief solutions


news.discovery.com 2015 01346.txt.txt

#Needle Injects Healing Electronics into the Brain Researchers have built a tiny mesh-like electronic sensor,


news.discovery.com 2015 01503.txt.txt

Pennsylvania State University geography professor Andrew M. Carleton and graduate student Jase Bernhardt studied April data from two weather stations, one in the South and the other in the Midwest,

Then they compared the daily temperature at sites with contrails above them with similar data from places where there weren't any of the man-made clouds in the sky.


news.discovery.com 2015 01605.txt.txt

At the University of Virginia, researchers have unveiled a new way to transmit wireless data in light waves from LED lights, a much more reliable and faster alternative to radio-wave Wi-Fi.

"We developed a modulation algorithm that increases the throughput of data in visible light communications," Maite Brandt-Pearce,

an engineering professor at the University of Virginia, told Phys.org. "We can transmit more data without using any additional energy."

The light waves can carry data at 300 megabits per second from LED fixtures to wireless devices.


news.discovery.com 2015 01645.txt.txt

600 degrees, contain illions of combinations of spectral data, Wuh said. Within each of the tiny particles is an elaborate nanopore structure; think of it as a series of microscopic holes within a thin membrane,

continuous process. Wuh asserted that it is best to liken these tiny structures to specific "fingerprints" or "signatures" of data. "There are hundreds of millions of signatures

and each signature is tied to a database that provides you with a tremendous amount of information,

Wuh said the idea of tiny microscopic particles containing data about a drug is no more far-fetched than someone 20 years ago saying that a person would have a "supercomputer the size of his palm." "But,

with the interconnected age and concerns over personal data, are there any concerns from potential consumers over the idea of ingesting technology that contains so much information?

Wuh suggested that those queasy about the idea of literally consuming this kind of data can rest assured. "There are no privacy issues at all.


news.discovery.com 2015 01744.txt.txt

"We think the data might show a different view of who's really driving.""Get more from Toms Guidethis article originally appeared on Toms Guide.


news.discovery.com 2015 01750.txt.txt

the SCIO sends information to an online database. Algorithms interpret the data in the light spectrum,

and identification information is delivered back to your phone within seconds. As more people use the SCIO system,


news.sciencemag.org 2015 02994.txt.txt

LHCb collected the data back in 2011 and 2012, but Wilkinson's team held back from announcing their discovery to avoid the fate of those who had made the earlier claims of pentaquark sightings.

the LHCb collaboration made use of data showing not only the energy of the particles produced in the CERN collisions but also their directions.

Running these data through a computer model, they found that they could get the experimental results

Now, the fresh data that will flow into LHCb should enable scientists to study the pentaquarks' structure,

The new data might also lead to the discovery of other pentaquarks with different masses."


news.sciencemag.org 2015 03054.txt.txt

That makes them attractive as backup power sources for hospitals and manufacturing plants as well as for producing distributed power systems not connected to the grid.


newscientist 00055.txt

and send data home for analysis. But future spacecraft may be able to do it all on their own.


newscientist 00058.txt

and after data we have today. Hospital MRI machines can weigh more than a tonne thanks to their strong superconducting magnets, making them impractical for the ISS.


newscientist 00065.txt

The pair intend to use their combined experience of space-based photographic databases and Earth observation privacy law to ensure that people can wield authentic imagery that stands up in court.

Finding the right pictures means trawling through huge databases of historical satellite data, and lawsuits involving such approaches frequently fail

because courts cannot be convinced of the authenticity of image data, says Purdy. For instance, people cannot be sure a satellite was working on the day in question

The space detectives will use their expertise in commissioning space images to order and their familiarity with the databases of space image suppliers like DigitalGlobe of Longmont, Colorado.

Because it is always possible to modify a digital image, you need strong archiving procedures plus information on


newscientist 00082.txt

Ueda and her team made the observations using data from the ALMA radio telescope. Computer simulations suggested that


newscientist 00126.txt

"Our experiment provides the first actual data on diamonds under such high pressure," says Ray Smith at the Lawrence Livermore National Laboratory in California.

The team's data can now be used to improve models of gas giants and the suspected diamond in their depths.

because we can now use direct experimental data to model the deep interiors of carbon-rich planets," says Madhusudhan.


newscientist 00150.txt

The appeal for Google and other firms is the potential to mine profitable data from satellite images.


newscientist 00176.txt

or by activity from the host star, and they say a massive rocky world is the best explanation for the data.


newscientist 00216.txt

There were good data taken before, during and after the supernova, and none of these showed obvious signs of a foreground object, says Quimby.


newscientist 00233.txt

Data from NASA's Galileo probe, which orbited Jupiter from 1995 to 2003, show clay-like minerals on Europa's surface, probably debris from meteor impacts


newscientist 00238.txt

Whether that for-the-camera useless blame game can translate into much-needed political will to accelerate backup plans for ISS transport remains to be seen


newscientist 00240.txt

The subsurface-sea idea is just the simplest possible interpretation of the gravity data, cautions William McKinnon at Washington University in St Louis, who was not involved in the work.


newscientist 00289.txt

Laser signals carry more data but the light is almost undetectable by the time it reaches Earth. Now a nanoscale light detector could make such deep-space missives easier to read.

Data must be encoded before it can be sent. The most reliable way of doing this is to vary the time interval between light pulses, with a long interval representing a 0, say
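
A minimal sketch of that pulse-interval scheme, with invented gap lengths rather than the mission's actual modulation parameters: the gap before each pulse encodes the bit.

# Toy pulse-interval encoder: each bit is represented by the gap before
# the next light pulse (long gap = 0, short gap = 1). Interval lengths
# here are illustrative only, not real mission values.
SHORT_GAP_NS = 100   # hypothetical gap encoding a 1
LONG_GAP_NS = 300    # hypothetical gap encoding a 0

def encode(bits):
    """Return the list of gaps (in ns) between successive pulses."""
    return [SHORT_GAP_NS if b else LONG_GAP_NS for b in bits]

def decode(gaps):
    """Recover bits by thresholding the measured gaps."""
    threshold = (SHORT_GAP_NS + LONG_GAP_NS) / 2
    return [1 if g < threshold else 0 for g in gaps]

message = [1, 0, 1, 1, 0]
assert decode(encode(message)) == message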


newscientist 00356.txt

But data from orbiters support the idea that the rocks and shadowed craters at both poles contain millions or even billions of tonnes of water ice.


newscientist 00389.txt

"I'd say the data are equivocal at the moment," says John Mustard of Brown University in Providence, Rhode Island.


newscientist 00440.txt

LLCD will beam signals to Earth at 622 megabits per second, six times as fast as is currently possible from the moon.

Joseph Kahn of Stanford University in California also acknowledges the need for higher bandwidth in returning ever larger amounts of data from space missions.


newscientist 00453.txt

Two years' worth of data still need inspecting, including information about the thousands of stars in its field of view.

Fabienne Bastien of Vanderbilt University in Tennessee and colleagues used Kepler data to watch instead for flickers in starlight due to short-lived convection cells or granules on the star's surface.

or asteroseismology signals from sun-like stars, says Jørgen Christensen-Dalsgaard of Aarhus University in Denmark, who leads a consortium of researchers who analyse Kepler's starquake data.


newscientist 00493.txt

This is actually the first real data we have that gives us the shape of the tail.


newscientist 00513.txt

Now Guillem Anglada-Escudé of the University of Göttingen in Germany and his colleagues have reanalysed the original data


newscientist 00572.txt

The dummy contains instruments that will collect data about the launch to be transmitted back to mission managers before re-entry.


newscientist 00602.txt

The next drill scoop will have to wait until the planet comes back into range; in the meantime, the science team has plenty of data to fuel new discoveries and daydreams.


newscientist 00648.txt

Roger Clowes of the University of Central Lancashire in Preston, UK, and colleagues discovered the structure using data from the Sloan Digital Sky Survey, the most comprehensive 3D map of the universe.


newscientist.com 2015 00003.txt

Added to the time for the other trips your data must take across the rest of the internet,

"Will the space around Earth become crowded with all these satellites vying to route our data?"

So Cahoy and colleagues are working on using light to transfer data instead. Easier to focus


newsoffice 00003.txt

pixels are illuminated by a white LED backlight that passes through blue, red, and green filters to produce the colors on the screen.

With more light shining through the pixels, LCD TVs equipped with Color IQ produce 100 percent of the color gamut,


newsoffice 00011.txt

which uses sensor identification badges and analytics tools to track behavioral data on employees, providing insights that can increase productivity.

and using that data to build a baseball team. But what if I could say: Here's how you need to talk to customers, here's how people need to collaborate with each other

Individuals can use that data to boost performance, and a company can use that to help set up an environment where everybody's going to succeed.

Readers placed around an office collect the data and push it to the cloud. Individuals have access to their personal data via a Web dashboard

or smartphone, but companies are given only anonymous, aggregated results of patterns and trends in behavior.)

By combining this information with employee-performance data from surveys, interviews and objective performance metrics, Sociometric can pinpoint areas where management can build more productive offices, in ways as surprising as providing larger lunch tables or moving coffee stations to increase interaction.

Accumulating more than 2,000 hours of data and comparing those data with survey results, they predicted with 60 percent accuracy that close-knit groups of workers who spoke frequently with one another were more satisfied

and got more work done more efficiently. They also found evidence of communication overload, where high volumes of email due to a lack of face-to-face interaction were causing some employees difficulty in concentrating


newsoffice 00012.txt

If you could turn the DNA inside a cell into a little memory device on its own


newsoffice 00025.txt

and analyzing vast quantities of data related to the diverse types of bacteria within the human body and their interactions with each other and the body's own cells and organs.

but our ability to translate these data into usable knowledge is lagging behind, says Arup K. Chakraborty, the Robert T. Haslam (1911) Professor of Chemical Engineering, Physics, Chemistry and Biological Engineering at MIT and director of the MIT Institute


newsoffice 00046.txt

Prepared to send sizeable chunks of data at any given time, the amplifiers stay at maximum voltage, eating away more power than any other smartphone component (and about 75 percent of electricity consumption in base stations), and wasting

The AMO technology was a new transmitter architecture where algorithms could choose from different voltages needed to transmit data in each power amplifier

This could be done on the transmitting and receiving end of data transfers. This caught the eye of Astrom, who had come to MIT after working in the mobile industry for 10 years, looking for the next big thing.

but you could see how much data traffic would explode. Fleshing out a business plan from an

Future-proofing technology Today, Eta Devices' major advantage is that its technology is able to handle ever-increasing data bandwidths.


newsoffice 00065.txt

along with several MIT graduate students, to test much smaller versions of the device in animals. "The Deshpande funding was an absolutely critical element in getting the data necessary to raise capital for Taris,

Indeed, collecting clinical data is a major challenge in spinning biotechnology out of the lab, notes Cima,

Thanks to the data gathered from the study, Cima and Langer were able to launch Taris, with Lee as chief scientist,

collected the data to determine it was feasible, that it was something that could make a big impact.


newsoffice 00086.txt

and other packets of data between continents, all at the speed of light. A rip or tangle in any part of this network can significantly slow telecommunications around the world.

and data loss. "We now have a set of design guidelines that allow you to tune certain parameters to achieve a particular pattern,


newsoffice 00093.txt

social media, data streams, and digital content. Pattern discovery and data visualization will be explored to reveal interaction patterns and shared interests in relevant social systems,

while collaborative tools and mobile apps will be developed to enable new forms of public communication and social organization.

and data can be an effective catalyst for increasing accountability and transparency, creating mutual visibility among institutions and individuals."


newsoffice 00103.txt

The findings need to be validated with more data, and it may be difficult to develop a reliable diagnostic based on this signature alone, Vander Heiden says.


newsoffice 00144.txt

so it can collect all that data and display that info to a user," says Downey, Airware's CEO,

and collects and displays data. Airware then pushes all data to the cloud, where it is aggregated and analyzed,

and available to designated users. If a company decides to use a surveillance drone for crop management, for instance,


newsoffice 00150.txt

or wireless gloves to seamlessly scroll through and manipulate visual data on a wall-sized, panoramic screen.

Putting pixels in the room G-speak has its roots in a 1999 MIT Media Lab project co-invented by Underkoffler in Professor Hiroshi Ishii's Tangible Media Group, called the "Luminous Room,"

which enabled all surfaces to hold data that could be manipulated with gestures. "It literally put pixels in the room with you,

and the data could interact with, and be controlled by, physical objects. They also assigned pixels three-dimensional coordinates.

Imagine, for example, if you sat down in a chair at a table, and tried to describe where the front,

and this much in front of me, among other things, Underkoffler explains. "We started doing that with pixels.

"And the pixels surrounded the model," Underkoffler says. This provided three-dimensional spatial information, from which the program cast accurate, digital shadows from the models onto the table.

A shared-pixel workspace has enormous value no matter what your business is. Today, Oblong is shooting for greater ubiquity of its technology.


newsoffice 00156.txt

Authoritatively answering that question requires analyzing huge volumes of data, which hasn't been computationally feasible with traditional methods.

So the researchers also analyzed the data on the assumption that only trips starting within a minute of each other could be combined.

In analyzing taxi data for ride-sharing opportunities, "typically the approach that was taken was a variation of the so-called traveling-salesman problem," Santi explains.

Unfortunately, the traveling-salesman problem is also an example, indeed perhaps the most famous example, of an NP-complete problem, meaning that even for moderate-sized data sets it can't (as far as anyone knows) be solved in a reasonable amount of time.

Next, the algorithm represents the shareability of all 150 million trips in the database as a graph.
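
A minimal sketch of that idea, using invented trips and a plain adjacency list rather than the researchers' actual algorithm: two trips are linked in the shareability graph if they start within a minute of each other (the one-minute rule mentioned above; the real analysis also checks that the combined route adds little delay, which is omitted here), and shared rides can then be read off as a matching on that graph.

# Toy shareability graph: trips are nodes; an edge links two trips whose
# start times fall within 60 seconds of each other. Trip data are invented.
from itertools import combinations

trips = {                      # trip_id -> start time in seconds
    "A": 0, "B": 45, "C": 50, "D": 200, "E": 230,
}

edges = [
    (t1, t2)
    for t1, t2 in combinations(trips, 2)
    if abs(trips[t1] - trips[t2]) <= 60
]

# A greedy matching: pair up trips so each trip is shared at most once.
matched, shared_pairs = set(), []
for t1, t2 in edges:
    if t1 not in matched and t2 not in matched:
        shared_pairs.append((t1, t2))
        matched.update((t1, t2))

print(shared_pairs)   # e.g. [('A', 'B'), ('D', 'E')]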

if it ran on a server used to coordinate data from cellphones running a taxi-sharing app.

whereas the GPS data indicated that on average about 300 new taxi trips were initiated in New York every minute.

Finally, an online application designed by Szell, HubCab, allows people to explore the taxi data themselves, using a map of New York as an interface.

David Mahfouda, the CEO of the car- and taxi-hailing company Bandwagon, whose business model is built specifically around ride sharing, says that his company hired analysts to examine the same data set that Santi

Making the entire data set available on a queryable basis does seem like a significantly larger lift.


newsoffice 00159.txt

Algorithms distinguish the pills by matching them against a database of nearly all pills in circulation.

If a pill isn't in MedEye's database, because it's new, for instance, the system alerts the nurse, who adds the information into the software for next time.

That's when we realized what a change it would be for a hospital to collect data


newsoffice 00169.txt

that uses precalculated supercomputer data for structural components, like simulated "Legos," to solve FEA models in seconds.

When app users plugged in custom parameters for problems, such as the diameter of that spherical obstacle, the app would compute a solution for the new parameters by referencing the precomputed data.

or a complex mechanical part. "And this creates a big data footprint for each one of these components,

After that, the software will reference the precomputed data to create a highly detailed 3-D simulation in seconds.

Ultimately, pushing the data to the cloud has helped Akselos, by leveraging the age-old tradeoff between speed and storage:

By storing and reusing more data, algorithms can do less work and hence finish more quickly. "These days,

with cloud technology, storing lots of data is no big deal. We store a lot more data than other methods,

but that data, in turn, allows us to go faster, because we're able to reuse as much precomputed data as possible,

he says. Bringing technology to the world Akselos was founded in 2012, after Knezevic and Huynh,

along with Leurent, who actually started FEA work with Patera's group back in 2000, earned a Deshpande innovation grant for their "supercomputing-on-a-smartphone" innovation. "That was a trigger,
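
The speed-for-storage tradeoff described in the quotes above can be illustrated with a generic caching sketch. This is not Akselos's reduced-basis solver, just the general pattern of storing expensive results once so later queries become fast lookups:

# Generic illustration of trading storage for speed: expensive results are
# computed once, stored, and reused on later queries. This stands in for
# the idea of referencing precomputed component data rather than re-solving.
import functools
import time

@functools.lru_cache(maxsize=None)    # cache grows as results are stored
def expensive_component_solve(parameter: float) -> float:
    time.sleep(0.5)                    # stand-in for a heavy simulation
    return parameter ** 0.5

start = time.time()
expensive_component_solve(42.0)        # first call: pays the full cost
first = time.time() - start

start = time.time()
expensive_component_solve(42.0)        # repeat call: near-instant lookup
second = time.time() - start

print(f"first call {first:.2f}s, cached call {second:.4f}s")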


newsoffice 00195.txt

That corresponds to five thousandths of a pixel in a close-up image, but from the change of a single pixel's color value over time,

it's possible to infer motions smaller than a pixel. Suppose, for instance, that an image has a clear boundary between two regions:

Everything on one side of the boundary is blue; everything on the other is red.

If, over successive frames of video, the blue region encroaches into the red region by even less than the width of a pixel, the purple of the pixel straddling the boundary will grow slightly bluer.
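
A toy version of that inference, not the researchers' actual phase-based method: if the boundary pixel's value is modeled as a linear blend of the blue and red regions, the blend fraction in each frame gives an estimate of how far, in fractions of a pixel, the boundary has moved.

# Toy sub-pixel motion estimate: the pixel straddling a blue/red boundary
# is modeled as a linear mix, value = f*blue + (1 - f)*red, where f is the
# fraction of the pixel covered by the blue region. Tracking f across
# frames recovers boundary motion smaller than one pixel.
import numpy as np

BLUE, RED = 1.0, 0.0                  # simplified single-channel intensities

def blue_fraction(pixel_value):
    """Invert the linear mix to get the covered fraction f."""
    return (pixel_value - RED) / (BLUE - RED)

# Invented values for the boundary pixel over successive video frames.
frames = np.array([0.50, 0.52, 0.55, 0.59])
fractions = blue_fraction(frames)
motion = np.diff(fractions)           # frame-to-frame shift, in pixels

print(motion)   # shifts of a few hundredths of a pixel per frame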

Putting it together Some boundaries in an image are fuzzier than a single pixel in width, however.


newsoffice 00197.txt

The data shows a loss of almost 1 percent of efficiency per week. But at present, even in desert locations, the only way to counter this fouling is to hose the arrays down, a labor- and water-intensive method.


newsoffice 00201.txt

The difficulty with this approach is that simulating a single pixel in the virtual image requires multiple pixels of the physical display.

So the physical pixels projecting light to the right side of the pupil have to be offset to the left

and the pixels projecting light to the left side of the pupil have to be offset to the right.

The use of multiple on-screen pixels to simulate a single virtual pixel would drastically reduce the image resolution.

The algorithm that computes the image to be displayed onscreen can exploit that redundancy, allowing individual screen pixels to participate simultaneously in the projection of different viewing angles.

In the researchers' prototype, however, display pixels do have to be masked from the parts of the pupil for which they're not intended.


newsoffice 00202.txt

which has amassed a vast facial-expression database, is also setting its sights on a mood-aware Internet that reads a user's emotions to shape content.

or human-to-human interaction point, capture data and use it to enrich the user experience.

or confusion, and pushes the data to a cloud server, where Affdex aggregates the results from all the facial videos (sometimes hundreds)

Years of data-gathering have trained the algorithms to be very discerning. As a PhD student at Cambridge University in the early 2000s, el Kaliouby began developing facial-coding software.

Back then, the data that el Kaliouby had access to consisted of about 100 facial expressions gathered from photos

and training the algorithms by collecting vast stores of data. Coming from a traditional research background, the Media Lab was completely different, el Kaliouby says.

Already, Affectiva has conducted pilot work for online learning, where it captured data on facial engagement to predict learning outcomes.

To be able to capture that data in real time means educators can adapt that learning experience


newsoffice 00214.txt

and generated one-third of the DNA sequence data for that project, the single largest contribution to the effort.

In the spirit of the Human Genome Project, the Broad makes its genomic data freely available to researchers around the world.

and disseminate discoveries, tools, methods and data openly to the entire scientific community. Founded by MIT, Harvard


newsoffice 00219.txt

and robotic joint angles multiple times with various objects, then analyzed the data and found that every grasp could be explained by a combination of two or three general patterns among all seven fingers.


newsoffice 00232.txt

#Own your own data Cellphone metadata has been in the news quite a bit lately, but the National Security Agency isn't the only organization that collects information about people's online behavior.

At the same time, a host of recent studies have demonstrated that it's shockingly easy to identify unnamed individuals in supposedly anonymized data sets, even ones containing millions of records.

So if we want the benefits of data mining, like personalized recommendations or localized services, how can we protect our privacy?

Their prototype system, openPDS, short for personal data store, stores data from your digital devices in a single location that you specify:

Any cellphone app, online service or big-data research team that wants to use your data has to query your data store

Sharing code, not data "The example I like to use is personalized music," says Yves-Alexandre de Montjoye, a graduate student in media arts and sciences and first author on the new paper.

you don't share data. Instead of you sending data to Pandora for Pandora to define what your musical preferences are, it's Pandora sending a piece of code to you for you to define your musical preferences

and send it back to them. De Montjoye is joined on the paper by his thesis advisor, Alex "Sandy" Pentland, the Toshiba Professor of Media Arts and Sciences;
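
A minimal sketch of that "send code to the data" pattern, with invented class and function names rather than the actual openPDS interface: the service ships a small query function, the function runs against the raw data inside the user's personal data store, and only the computed answer, not the data, goes back.

# Sketch of the question-and-answer model described above: the raw listening
# history never leaves the user's store; only the computed summary does.
# All names here are hypothetical, not the real openPDS API.
from collections import Counter

class PersonalDataStore:
    def __init__(self, listening_history):
        self._history = listening_history          # raw data stays here

    def run_query(self, query_fn):
        """Run third-party code locally and return only its answer."""
        return query_fn(self._history)

# The "piece of code" a music service might send over.
def favorite_genres(history, top_n=3):
    return [genre for genre, _ in Counter(history).most_common(top_n)]

store = PersonalDataStore(["jazz", "rock", "jazz", "folk", "jazz", "rock"])
answer = store.run_query(favorite_genres)
print(answer)   # ['jazz', 'rock', 'folk']: the summary, not the raw history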

Although openPDS can in principle run on any machine of the user's choosing, in the trials data is being stored in the cloud.

In fact, applications frequently collect much more data than they really need. Service providers and application developers don't always know in advance what data will prove most useful,

so they store as much as they can against the possibility that they may want it later.

openPDS preserves all that potentially useful data, but in a repository controlled by the end user, not the application developer or service provider.

If we manage to get people to have access to most of their data, and if we can get the overall state of the art to move from anonymization to interactive systems, that would be such a huge win.

because it allows users to control their data and at the same time open up its potential both at the economic level

"I don't see another way of making big data compatible with constitutional rights and human rights."



