By 2010 its database contained 150 billion soil observations and 10 trillion weather-simulation points.
The Climate Corporation planned to use these data to sell crop insurance. But last October Monsanto bought the company for about $1 billion, one of the biggest takeovers of a data firm yet seen.
Monsanto, the world's largest hybrid-seed producer, has a library of hundreds of thousands of seeds,
and terabytes of data on their yields. By adding these to the Climate Corporation's soil-and-weather database,
it has produced a map of America showing which seed grows best in which field, under what conditions.
FieldScripts uses all these data to run machines made by Precision Planting, a company Monsanto bought in 2012
to boost its farm-data business. Monsanto's planters, loaded with data, can plant a field with different varieties at different depths and spacings, varying all this according to the weather.
The benefits are clear. Farmers who have tried Monsanto's system say it has pushed up yields by roughly 5% over two years,
The seed companies think providing more data to farmers could increase America's maize yield from 160 bushels an acre (10 tonnes a hectare) to 200 bushels, giving a terrific boost to growers' meagre margins.
But the story of prescriptive planting is also a cautionary tale about the conflicts that arise when data entrepreneurs meet old-fashioned businessfolk.
Farmers fear that the stream of detailed data they are providing on their harvests might be misused.
The prescriptive-planting firms might even use the data to buy underperforming farms and run them in competition with the farmers,
or the companies could use the highly sensitive data on harvests to trade on the commodity markets.
An industry code of conduct says that farmers own and control their data, and that companies may not use the information except for the purpose for which it was provided.
Also, once data have been sent and anonymised, farmers might be said no longer to own them, so it is not clear
how well placed they are to negotiate with the data providers. Another worry is that, since the companies have not yet made the data fully portable, farmers may become locked into doing business with a single provider.
To assuage all these concerns, the Climate Corporation has set up a free data storage service for farmers,
which others cannot access without the farmers' permission. New niche data-management firms are entering the market,
which should help make it more competitive. For the time being, though, the biggest companies will dominate prescriptive planting.
They collect the most comprehensive data and make better use of them than anyone else.
And that raises a problem which affects big data in all its forms. Prescriptive planting could boost yields everywhere.
According to data presented this past week at the NCWIT Summit for Women in IT by Ed Lazowska, the Bill & Melinda Gates Chair in Computer Science & Engineering at the University of Washington,
and offers up-to-date data on the wearer's mental and physical tiredness by linking the MEME to a smartphone feed.
Changes in voltage are then collected into data that are analysed for parameters such as alertness or fatigue.
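As a rough illustration of that pipeline, the sketch below (in Python with numpy) reduces a raw voltage trace to one such parameter, a blink rate, by counting threshold crossings; the threshold, the sampling rate and the blink-rate-as-fatigue heuristic are all illustrative assumptions, not the device maker's actual algorithm.

```python
import numpy as np

def blink_rate(voltage, fs, threshold=150.0):
    """Estimate blinks per minute from a raw voltage trace.

    voltage: 1-D array of samples (hypothetical microvolt units)
    fs: sampling rate in Hz
    threshold: deflection above baseline counted as a blink (assumed)
    """
    baseline = np.median(voltage)
    above = (voltage - baseline) > threshold
    # A blink is one rising edge: the sample where the trace first
    # crosses the threshold.
    blinks = np.count_nonzero(above[1:] & ~above[:-1])
    return blinks / (len(voltage) / fs / 60.0)

# Toy trace: 60 s of sensor noise at 100 Hz with three synthetic blinks.
fs = 100
rng = np.random.default_rng(0)
v = rng.normal(0.0, 10.0, 60 * fs)
for start in (1000, 3000, 5000):
    v[start:start + 30] += 300.0  # blink-like deflection
print(f"blink rate: {blink_rate(v, fs):.1f} per minute")  # ~3.0
```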
and was used for collecting data as well as video. It would dramatically change medical education. Imagine an attending physician seeing what you saw during a simulation,
but he looks forward to incorporating data: calling up a patient's electronic health record, for instance
#Large-scale Disruptions Most data are unstructured. Every day there are millions of social media posts, customers filling out comment cards, and mentions in mass media and on blog posts.
Because all of the data is housed together rather than distributed on clients' servers, the algorithms are able to learn
and improve from a much wider data set. Semantria's business is booming, growing at 20% per month.
Further understanding of crop variability, geolocated weather data and precise sensors should allow improved automated decision-making and complementary planting techniques.
analyze it and compare it to a database of known spectral signatures, and display the information in an easy-to-understand manner.
In turn, the readings provided by users will make the spectral signature database more complete. Consumer Physics has developed three different applications for identifying food, medicines, and plants.
is also populating the first databases and apps that work with the SCIO; it hopes other companies will build their own apps.
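To illustrate the matching step described above, here is a minimal Python sketch of comparing a measured spectrum against a database of known signatures by cosine similarity; the toy library, the function names and the similarity measure are assumptions for illustration, not Consumer Physics' actual method.

```python
import numpy as np

def identify(sample, library):
    """Match a measured spectrum to the closest known signature by
    cosine similarity. `library` maps names to reference spectra on
    the same wavelength grid as `sample` (both illustrative)."""
    def cos(a, b):
        return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return max(library, key=lambda name: cos(sample, library[name]))

wavelengths = 100  # number of spectral channels, assumed
rng = np.random.default_rng(2)
library = {name: rng.uniform(0, 1, wavelengths)
           for name in ("apple", "cheddar", "ibuprofen")}
measured = library["cheddar"] + rng.normal(0, 0.05, wavelengths)  # noisy reading
print(identify(measured, library))  # -> cheddar
```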
we have developed software allowing you to map glove data to musical control signals (e.g. MIDI and OSC).
Gestural data interfaces have been around for decades and have been used for many different applications (see our review of other glove systems).
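As a concrete example of such a mapping, the sketch below converts a normalized flex-sensor reading into a MIDI control-change message using the mido library; the controller number and the linear scaling are arbitrary illustrative choices, not the authors' actual software.

```python
import mido  # pip install mido

def flex_to_cc(flex, controller=1, channel=0):
    """Map a normalized flex-sensor reading (0.0-1.0) onto a MIDI
    control-change message. Controller number and linear scaling
    are illustrative, not the glove authors' actual mapping."""
    value = max(0, min(127, round(flex * 127)))
    return mido.Message('control_change', channel=channel,
                        control=controller, value=value)

# Three finger-bend readings become modulation-wheel messages.
for reading in (0.0, 0.5, 0.93):
    print(flex_to_cc(reading))

# To actually send them to a synthesizer:
#   port = mido.open_output()
#   port.send(flex_to_cc(reading))
```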
Along with real-time video chats with Mayo Clinic nurses, the new service also includes personally tailored health information culled from Mayo Clinic databases
particularly where the company has attempted to negotiate "zero-rated" deals that allow customers to use some of Facebook's offerings without it counting against their data plans.
But one oft-overlooked area where UAS technology could really be a boon for Facebook is in data moving the other direction.
Right now, Facebook owns mountains of data on its users, but relatively little on the parts of the world that aren't already connected to Facebook.
Comparatively, Google's acquisitions and exclusive deals with third parties provide it with everything from the rich trove of geospatial data that powers Google Maps to the energy use
With a fleet of UAS in the sky, Facebook could begin gathering its own proprietary geospatial data, aerial imagery, traffic data,
and meteorological data: information that it could then integrate into new products or sell to companies that need it,
technologies that have to do with crunching all the data that you get from all these things: those are the weapons you need to have with you going into the next competitive battles.
The era of suitcase-schlepping may soon be over. Imagine: design is just data, and products could travel through the Internet as code,
The main focus with this ruling is to ensure that you pay the same for calls, texts and data across the EU,
And it's also about learning and gathering data. And I can also make tooling that can use more conventional manufacturing. It's not just about making a final part,
you instantly get data coverage, electricity and local weather data, even if your previous system involved a sundial and carrier pigeons.
Bristol, England: The Connecting Bristol digital partnership leads the city's work on next-generation broadband infrastructure, smart city, open data, green IT and digital inclusion,
Glasgow aims to open up data to demonstrate how providing integrated health, transport, energy and public safety services can improve both the local economy and the quality of life for the city's citizens.
#Reacting to Big data However, Bitcoin is not just a currency that promises to eventually end the trend of patchwork national currencies that exist for the almost sole purpose of allowing governments to endlessly fund their own deficit spending.
as well as many others, all keep meticulous track of user data for advertising and other purposes.
On several occasions, the massive amounts of data collected by Internet service and telecommunications companies have been utilized by agencies such as the NSA, with motives that are morally questionable at best.
and then provide all of that data to one central authority you may or may not trust, all with little choice for the consumer.
intentionally or not, is a massive reaction to the trend of Big data. It is decentralized and anonymous by design,
and it is these key features within the Bitcoin protocol itself that may be the key to weakening the hold of massive data collecting service companies like Google.
The U.S. sometimes programs its semi-autonomous drones to locate a terrorist based on his cell phone SIM card.
The terrorists, knowing that, often offload used SIM cards to unwitting civilians. Would an autonomous killing machine be able to plan for such deception?
"Our data pave the way for such a stem cell therapy," Blau said. Other recent work in stem cell therapy has looked similarly for ways the cells could improve functioning of existing organs, rather than building replacements through regenerative medicine.
and other financial agents just as peer-to-peer file sharing did the music industry, and some of the architects of this financial Napster seem gleeful about the possibility.
and even data-conditional transactions, in which a script uses a data input such as a regular Google search to monitor real-world events that would automatically trigger disbursements or other actions.
Alternately, Antonopoulos suggests thinking of the bitcoin blockchain as "having an API" (application programming interface) that makes its data usable by third parties,
and interact with Twitter data in slightly modified or reorganized forms. Efforts to make complex financial functions a part of Bitcoin have been bubbling through 2013,
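Bitcoin script cannot itself fetch outside data, so in practice a separate "oracle" process watches the data input and releases the payment when the agreed condition is met. The Python sketch below shows that oracle pattern in miniature; the rainfall feed and the send_payment stub are hypothetical placeholders, not a real Bitcoin or weather API.

```python
import random
import time

def fetch_rainfall_mm():
    """Hypothetical data input; a real oracle would query a weather API."""
    return random.uniform(0.0, 60.0)

def send_payment(address, amount_btc):
    """Placeholder only; no real Bitcoin transaction is constructed here."""
    print(f"would sign and broadcast: {amount_btc} BTC to {address}")

def monitor(address, amount_btc, threshold_mm=10.0, checks=5):
    """Watch the data input and trigger the disbursement once the agreed
    condition holds, e.g. drought cover paying out on low rainfall."""
    for _ in range(checks):
        if fetch_rainfall_mm() < threshold_mm:
            send_payment(address, amount_btc)
            return True
        time.sleep(0.1)  # poll interval, shortened for the demo
    return False

monitor("1ExampleDestinationAddress", amount_btc=0.5)
```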
The price paid for that flexibility is the need to ensure a sufficient degree of overlap so the CFD solver can accurately share data between the component grids.
Interconnect performance capabilities are critical to data- and compute-intensive applications, which require ultra-low latency:
you can transmit a whole Blu-ray disc of data in two seconds. ConnectX-4 opens new capabilities for high-performance computing, data analytics, machine learning and storage applications.
With the new records of interconnect performance, ConnectX-4 adapters provide the means to increase data center return on investment while reducing IT costs.
ConnectX-4 has already been selected to power CORAL (Collaboration of Oak Ridge, Argonne and Lawrence Livermore National Labs) to help ease the Department of Energy's new mission-critical applications.
#Seagate & Supermicro Optimize SQL Databases As part of our strategy and commitment to engage with ecosystem vendors to jointly bring optimized
Microsoft SQL Server 2014 Fast Track is a program to develop reference architectures to give medium to large data warehouses a step-by-step guide to building out a data warehouse using well-tuned hardware.
and grow while maintaining a common experience across the data warehouse. "Our efforts are targeted at supporting organizations that seek to deploy SQL Server 2014 flawlessly in their data center,"
said Tiffany Wissner, senior director of product marketing, data platform, Microsoft. "We're proud to work with companies such as Seagate
and Supermicro, who continue to innovate and deliver highly reliable and high-performance Microsoft Fast Track Data Warehouse designs."
With this solution companies will be able to more efficiently support real-time reporting and streaming data,
which are now the norm for business expectations and productivity. Predicting the performance capabilities of a collection of components can be difficult
and a reference architecture helps to eliminate any uncertainty in determining the right mix of components to best suit the specific needs of a data warehouse.
Supermicro SuperServer SYS-4048B-TRFT system and Microsoft SQL Server 2014 Enterprise Edition to deliver a fully optimized,
efficient deployment of Microsoft SQL Server 2014 Fast Track Data Warehouse," said Don Clegg, vice president of marketing and business development at Supermicro. "Our Microsoft-certified solution comes preconfigured with quad Intel Xeon E7 processors, Seagate Nytro Flash Accelerator cards, memory,
storage and SQL Server 2014 installed and validated. With this turnkey solution, organizations can focus efforts on more strategic decision-making activities around data analytics
and business intelligence to accelerate growth." For more information please visit http://www.seagate.com and see Seagate and Microsoft Fast Track.
Information on the Microsoft SQL Server Fast Track Data Warehouse reference architecture can be found at Microsoft Solutions.
The North Central Sustainable Agriculture Research and Education (SARE) program and the Conservation Technology Information Center conducted the survey of more than 759 commercial farmers from winter 2012 through spring 2013.
which uses extensive data from a farmer's field and the surrounding region to help predict weather conditions
While collecting real-time data on weather, soil, crop health and air quality is important, as is the availability of equipment,
At IBM we developed a precision-agriculture weather-modeling service using Deep Thunder, our Big data analytics technology, for local, customized, high-resolution and rapid weather predictions.
It gathers data from sensors placed throughout fields that measure the temperature and moisture levels in soil and surrounding air.
The system then combines the field data with a diversity of public data from the National Oceanic and Atmospheric Administration, the National Aeronautics and Space Administration and the U.S. Geological Survey, and private data from companies like Earth Networks.
A supercomputer processes the combined data and generates a four-dimensional mathematical model derived from the physics of the atmosphere.
By combining supercomputing and Big data analytics with other technological innovations even farmers with modest means can bolster production and profits.
All it needs is a resolution of at least 40 by 40 pixels. Using Facet on a video sequence produces even more interesting results
In these types of cases the software does need images clearer than 40 pixels, but the required resolution is still within a common webcam's capabilities.
ADAMM transmits its data wirelessly to a mobile app, where it can remind the wearer to take their medication,
and communicate the data that is measured. Currently MEMS readers measure and communicate information electronically, which is subject to interference from electrical "noise" from nearby devices and the environment.
and iteratively with the seismic and well data required to model the structure, rock and fluid properties.
and imaging to interpretation and modeling, reservoir characterization, reservoir engineering and drilling and data management.
and technology that enable effective seismic data processing and imaging, velocity modeling and seismic interpretation. In this release, significant enhancements were made in the Paradigm processing
and sensors into a desirable geometry with 5D data reconstruction; enhancing fracture determination from seismic data with improved full-azimuth imaging.
In this release, SeisEarth interpreters have access to a wealth of seismic inversion and data analysis functionality within the application:
seismic facies classification and on-the-fly attribute calculation; identifying potential hydrocarbon locations with AVA inversion and QC of pre-stack seismic data; predicting flow for well planning.
With Paradigm seismic processing and imaging solutions we are able to extract that detail by building velocity models that honor available geologic data
The center is working on something like Big data for smart batteries, turning these mysterious devices into information centers that, according to doctoral student Mohammad Rezvani, can tell their users
If every part of the pack can be monitored with the kind of Big data equipment that's now hugely in vogue, one bad apple won't spoil the whole bunch.
and perform data operations at the same time. In contrast, IBM's new chip architecture resembles that of a living brain.
which minimizes the time needed to transmit data, Modha said. Kwabena Boahen, an electrical engineer at Stanford who led the development of the Neurogrid system,
and range extender REX cars) but BMW's Healey says the data isn't there yet. "The i3 has only been on the market for two
The data collected from these trial runs will be used to develop new laser weapons for the Navy under the Office of Naval Research's Solid-State Laser Technology Maturation program.
#Scientists achieve quantum teleportation of data with 100 percent accuracy Dutch scientists working with the Kavli Institute of Nanoscience at the Delft University of Technology have made a stunning breakthrough in quantum technology
by successfully teleporting data across a distance of about 10 feet with perfect accuracy, reports the New York Times. The advance ought to have Albert Einstein, who famously dismissed the idea of quantum teleportation as "spooky action at a distance", rolling in his grave.
The team compared these gene-activity results with data from other species, in particular the mouse brain.
Panasonic's Eco Ideas House, with solar, a fuel cell, battery backup and a plug-in Toyota Prius, has long stood next to the company's headquarters in Tokyo,
and will share the data with the CSA for possible use on future space missions and other applications.
and conductivity, allowing them to transmit data on the environmental factors they experience. Javey said the team's e-whiskers could lead to technologies for real-time monitoring of environmental factors.
#Quantum computer technology now capable of holding data with 99 percent accuracy Perhaps the zaniest property of quantum mechanics is that of entanglement,
and transfer data instantaneously, but learning to control the quantum data has proven difficult. The latest breakthrough in quantum computing,
however, brings the technology much closer to reality. Australian scientists have developed the first silicon quantum technology capable of holding data with over 99 percent accuracy
reports PC Mag. It's particularly significant because silicon is the same material used to build conventional computers,
Taken together, the two methods improved the reliability of data retention from just 50 percent to over 99 percent
Researchers at the University of Maryland, funded by DARPA's Mathematics of Sensing, Exploitation and Execution (MSEE) program, are teaching robots how to process visual data
and capable of delivering 50 megabits per second Internet access. As a comparison, the average global broadband speed currently is only 21.9 Mbps. OneWeb is expected to cost between $1.5 billion and $2 billion, a relative bargain compared to
If there are data to back up these decisions, they are based on an engineer's traffic counts or an estimator's dollar estimates.
instead there are hard data on how much it will cost to treat the asthma cases that are caused by the extra pollution.
If you put in a bike lane instead of a car lane there are data that show how it increases the health of the population.
or rip out another bike lane, there are real data that people can show on the true costs, both economic and social.
and concrete costs, but that now can also produce real data on social value, pollution, health and, yes, even happiness.
what action is called for in the face of the data presented by the study. "If you attempted to go out
These data will be combined with samples taken over the next four years at 60 randomly chosen sites across Sub-Saharan Africa,
The small radio-frequency signals given off by the recovering nuclei provide the imaging data.
The data may also reveal the viral characteristics that are associated with longer, more symptomatic infections."
The government agencies Environment Canada and Health Canada will use the data to make risk assessments for the materials
Voluntary data-reporting schemes have been trialled in other countries with limited success. The ongoing voluntary programme of the US Environmental Protection Agency (EPA) has so far received submissions from 29 companies on more than 120 nanoscale materials;
and is pleased that companies are expected to provide their data within four months of beginning to use
which include getting city agencies to provide public data in standard formats such as XML and RSS feeds.
which stores data and applications online rather than on individual computers. And he recently implemented an internal management tool in
"We want to launch a data. gov site to make a vast array of government data public."
Likewise, any data streams coming out of data. gov will have to comply with legislation that protects citizens'privacy y
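As a small example of consuming such a feed, the Python sketch below pulls item titles out of any standards-compliant RSS 2.0 feed using only the standard library; the URL is a placeholder, not a real data.gov endpoint.

```python
import urllib.request
import xml.etree.ElementTree as ET

def rss_item_titles(url):
    """Return the title of each <item> in an RSS 2.0 feed.

    Layout assumed: <rss><channel><item><title>...</title></item>...
    """
    with urllib.request.urlopen(url) as response:
        root = ET.fromstring(response.read())
    return [item.findtext("title") for item in root.iter("item")]

# Placeholder URL; substitute any real RSS feed of published data sets.
for title in rss_item_titles("https://example.gov/datasets.rss"):
    print(title)
```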
#Camera shoots and compresses image in one go Today's cameras take images as 'raw' arrays of pixels,
which can take up many megabytes of storage, and then use data-compression algorithms such as JPEG to store the image in a smaller file."
"Why collect that data in the first place? asks John Hunt, an engineer at Duke university in Durham, North carolina,
The trick is figuring out what data to acquire, says Richard Baraniuk, an electrical and computer engineer at Rice university in Houston,
But it is not possible for a device to know what is important before the data are recorded.
Compressive sensing works by sampling at random, eliminating the need to sift through data, and it still produces enough information to generate a good image.
In 2006 Baraniuk and his colleagues made the first optical camera that produced multi-pixel images with a single-pixel sensor.
An algorithm compiles data from each wavelength to generate a complete image every 0.1 seconds.
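The mathematical core of compressive sensing can be shown in a few lines: take far fewer random measurements than pixels, then recover the sparse signal by l1 minimization. The Python sketch below does this with a basic iterative soft-thresholding solver; it illustrates the principle only and is not the single-pixel camera's actual reconstruction code.

```python
import numpy as np

def ista(A, y, lam=0.01, iters=500):
    """Recover a sparse x from y = A @ x by iterative soft-thresholding,
    a simple solver for the l1-regularized least-squares problem."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        g = x - A.T @ (A @ x - y) / L      # gradient step on the data fit
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # shrink
    return x

rng = np.random.default_rng(0)
n, m, k = 200, 60, 5                       # signal length, measurements, nonzeros
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(0.0, 1.0, k)
A = rng.normal(0.0, 1.0 / np.sqrt(m), (m, n))  # random sampling matrix
y = A @ x_true                             # only m << n measurements recorded
x_hat = ista(A, y)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```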
A team member started analysing the data on the drive back from the missile range, and immediately saw evidence of braids in the twists of coronal gas.
The researchers say that their technique could easily be scaled up to store all of the data in the world.
a photo of the researchers'institute and a file that describes how the data were converted.
For example, CERN, the European particle-physics lab near Geneva, currently stores around 90 petabytes of data on some 100 tape drives.
Goldman's method could fit all of those data into 41 grams of DNA. This information should last for millennia under cold,
Goldman's group developed a more complex cipher in which every byte (a string of eight ones and zeros) is translated into DNA letters.
The system encodes the data in partially overlapping strings, in such a way that any errors on one string can be cross-checked against three other strings.
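The flavour of such a cipher can be sketched briefly. Goldman's actual scheme compresses bytes with a Huffman code first; the simplified Python version below substitutes a fixed-width base-3 conversion but keeps the essential rotation trick, in which each digit selects a base different from the previous one so no letter ever repeats (runs of identical bases are error-prone to sequence).

```python
def bytes_to_trits(data):
    """Fixed-width base-3 expansion: six trits per byte (3**6 = 729 >= 256)."""
    trits = []
    for b in data:
        for _ in range(6):
            trits.append(b % 3)
            b //= 3
    return trits

# Rotation cipher: each trit picks one of the three bases that differ
# from the previous base, so the same letter never appears twice in a row.
NEXT_BASE = {'A': 'CGT', 'C': 'GTA', 'G': 'TAC', 'T': 'ACG'}

def trits_to_dna(trits, prev='A'):
    out = []
    for t in trits:
        prev = NEXT_BASE[prev][t]
        out.append(prev)
    return ''.join(out)

dna = trits_to_dna(bytes_to_trits(b"hi"))
print(dna)  # 12 bases with no two identical neighbours
```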
It costs $12,400 to encode every megabyte of data, and $220 to read it back. However, these costs are falling exponentially.
but that will rarely be accessed, such as CERN's data. If costs fall by 100-fold in ten years,
if you want to store data for at least 50 years. And Church says that these estimates may be too pessimistic,
The data in the new study only show how the nano-twinned boron nitride responded to indentation loads with up to seven newtons of force."
In that work, published in Applied Physics Letters, Dubrovinskaia and her colleagues presented data from Vickers testing with loads of up to 10 newtons.
better behaved and backed by an extra decade's worth of data, promise to have an important supporting role.
but was concerned that much of the data depended on the actions of ZIP. He and his collaborators took a different route,
"I think the future will be to try to find the backup mechanisms for memory. However, Huganir s team also created mice whose PKM?
when James Van Allen first spotted them using satellite data half a century ago, and that's also the structure that NASA's twin Van Allen Probes recorded
Baker goes on to say that data collected by the probes on 9 October revealed that "suddenly the whole outer belt was lit up again but with the middle ring gone."
But advances in sequencing have allowed biologists to accumulate large data sets of RNA sequences, including some from RNA without tails.
And when Rajewsky and his colleagues mined databases for circular RNA molecules, they found thousands in nematode worms, mice and humans."
The committee drew heavily on clinical data, but also took extrapolations from basic research and post hoc analyses of clinical trials.
and to focus on data from randomized clinical trials, says committee chairman Neil Stone, a cardiologist at Northwestern University School of Medicine in Chicago.
When satellite data suggested that Cape Darnley might be a candidate, the researchers moored instruments on the seabed,
In addition, they relied on data from elephant seals (Mirounga leonina) tagged with instruments that monitor ocean conditions."
using the video processing unit to transform images from the video camera into electronic data that is wirelessly transmitted to the retinal prosthesis.
Stanford University researchers are in the early stages of developing self-powered retinal implants where each pixel in the device is fitted with silicon photodiodes.
Patients would wear goggles that emit near-infrared pulses that transmit both power and data directly to the photodiodes.
although recombinases have been used similarly in the past (to write data into a DNA memory, for example) the latest work takes the idea a step further by making the DNA part of the computation itself."
A comparison with traditional surveillance data showed that Google Flu Trends, which estimates prevalence from flu-related Internet searches,
But as flu-tracking techniques based on mining of web data and on social media proliferate, the episode is a reminder that they will complement,
Based on research by Google and the CDC, it relies on mining records of flu-related search terms entered in Google's search engine,
Its estimates have matched almost exactly the CDC's own surveillance data over time, and it delivers them several days faster than the CDC can.
and some of its state data show even larger discrepancies. It is not the first time that a flu season has tripped Google up.
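At its core, the method fits a statistical model from query frequencies to reported illness rates. The Python sketch below fits such a linear model to made-up weekly data; it is a toy stand-in, since the real system screened millions of candidate terms and, as this article describes, needs periodic recalibration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical training data: 52 weeks of normalized frequencies for
# three flu-related query terms, plus the CDC's reported illness rate.
searches = rng.uniform(0.0, 1.0, (52, 3))
cdc_rate = searches @ np.array([2.0, 0.5, 1.0]) + rng.normal(0.0, 0.1, 52)

# Fit a linear model (with intercept) from query frequency to illness rate.
X = np.column_stack([searches, np.ones(52)])
w, *_ = np.linalg.lstsq(X, cdc_rate, rcond=None)

# Nowcast this week's rate from fresh query frequencies, days before
# official surveillance numbers arrive.
this_week = np.array([0.4, 0.2, 0.7, 1.0])  # three terms + intercept
print(f"estimated illness rate: {this_week @ w:.2f}")
```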
but it emerged clearly from the data set of 637 languages. This 'functional load' hypothesis had been viewed with some scepticism,
but all data are publicly available. Keeping data open and focusing on specific drug mechanisms makes his consortium's approach much simpler.
"Intellectual-property deals, assays coming from everywhere, multi-institutional agreements. Wow, that's hard," he says.
who has worked on data from Planck and WMAP. "Instead, we've got new evidence that this expansion did happen.
Those data have enabled cosmologists to work out when the Big Bang happened, estimate the amount of unseen dark matter in the cosmos
These precise measurements show that the Universe is expanding slightly slower than estimated from WMAP's data.
The Planck data also imply that dark energy makes up 68.3% of the energy density of the Universe,
a slightly smaller proportion than estimated from WMAP data. The contribution of dark matter swells from 22.7% to 26.8%
however, raise tantalizing hints that there may yet be new physics to be discovered in Planck's data.
So far, the team has analysed about 15.5 months of data, and "we have about as much again to look at,
The team expects to release the next tranche of data in early 2014.
#Serotonin receptors offer clues to new antidepressants Researchers have deciphered the molecular structures of two of the brain's crucial lock-and-key mechanisms.
Existing 3D-display technologies tend to have larger pixel sizes and therefore operate better at distances of a few metres,
But the pixels in that case tend to be larger he explains, because it is difficult to produce small lenses of high optical quality.