which can transmit to an open data platform: Smart Citizen. These sensor-enhanced hive designs are open
and freely available online; the data collected from each hive is published together with geolocations, allowing for further comparison and analysis of the hives.
The heavy lifting of the Zebedee is done in software as all that conflicting laser data is converted to a 3-D map.
--but that's not necessarily a bad thing, as long as the software is capable of sorting out the redundant data in a reasonable time frame, which it sounds like it is.
And to keep the weight down, the data could simply be beamed back to the drone's remote site to be processed there.
All the data pointed to the same conclusion, says lead author Spencer Smith, an assistant professor of neuroscience
and there is a lot of data to be processed and interpreted. Accuracy is difficult if the bed proves to be highly featured
Take the case of the bed data recently released. It has a spatial resolution of 1 km
The data was collected over several decades by NASA and researchers from the UK and Germany.
By analysing this radar data, the team were able to map the topography of the underlying bedrock.
After that, the researchers let the patients experience their stay in the hospital as they normally would, using the electrodes to record data on the seizures as well as everything else they did during the hospital stay, like eating or speaking.
Cameras monitored the patients from their rooms allowing the researchers to determine how the data they got from the electrodes matched up with
however, cameraphone technology needs to support it in ways it currently doesn't. Cameraphones have improved dramatically in the last few years: the Nokia PureView sensor has 41 megapixels
and HTC's newest sensor has larger pixels that grab more light, but they still suffer from one great shortfall:
The 16.3-megapixel Samsung Galaxy Camera has a 4G radio and a 21x zoom lens. And the newer 20.3-megapixel Galaxy NX has an interchangeable lens mount.
The Sony QX100, the newest offering in the lot, is the most extreme example. The device is just a lens, sensor
Using data from Kepler, lead author Erik Petigura and his team analyzed 42,000 G- and K-type stars visible to the naked eye from Earth.
if the shutdown continues, the disruption to the Antarctic summer research season could be catastrophic. "We have 22 years of data showing the summer snapshot in this area that's changing really rapidly," says Oscar Schofield, an oceanographer at Rutgers University.
The whole point of a time series is to have continuous data so that you can talk about the trends in the system.
"Once it's gone, it's never coming back; we lose this data forever," he says. "Because of the nature of our work, where we're analyzing long time series of data
Some scientists have been gathering data at Palmer for even longer. Bill Fraser of the Polar Oceans Research Group has been studying seabirds there since 1974.
and we're left with this critical missing year that you really don't know anything about," he says. "It's a total blank in the database.
It's a very serious impact," Fraser says. A particularly critical database documents southern giant petrels.
For this species, skipping a year of data collection "would be something from which there's almost no recovery," he says. "We would miss a group of first-year breeders or young birds
Together this data could help police find explosives while they're still being manufactured. The team behind EMPHASIS announced earlier this month that the sensors had proved successful in lab tests
Wednesday at the Disaster City exercise, first responders measured radiation levels with the app, then practiced sending the data to officials through a wireless network.
The idea is that those officials will be able to make better-informed decisions more quickly with the data.
How Scott Collis Is Harnessing New Data To Improve Climate Models. Each year, Popular Science seeks out the brightest young scientists and engineers and names them the Brilliant Ten.
--The Editors. Argonne National Laboratory. Harnessing new data to improve climate models. Clouds are one of the great challenges for climate scientists.
But rudimentary data has simplified their role in simulations leading to variability among climate models. Scott Collis discovered a way to add accuracy to forecasts of future climate by tapping new sources of cloud data.
Collis has extensive experience watching clouds first as a ski bum during grad school in Australia and then as a professional meteorologist.
and Climate Research, he realized there was an immense source of cloud data that climate modelers weren't using:
So Collis took on the gargantuan task of building open-access tools that convert the raw data from radar databases into formats that climate modelers can use.
In one stroke, Collis unlocked years of weather data. "We were able to build such robust algorithms that they could work over thousands of radar volumes without human intervention," says Collis.
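The article doesn't name the tools, but one widely used open-source option for this kind of radar-to-model conversion is the Python ARM Radar Toolkit (Py-ART); the minimal sketch below assumes that library and a hypothetical file name, and simply reads a raw radar volume and lists the fields a climate modeler could work with.

```python
# Minimal sketch (assumption, not the article's own code): read a weather-radar
# volume with the open-source Py-ART toolkit and inspect its data fields.
import pyart

radar = pyart.io.read("example_radar_volume.nc")  # hypothetical input file

# Each "field" (e.g. reflectivity) is a gate-by-gate masked array.
for name, field in radar.fields.items():
    print(name, field["data"].shape)

# Number of rays and range gates in this volume.
print(radar.nrays, radar.ngates)
```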
When the U.S. Department of Energy caught wind of his project, it recruited him to work with a new radar network designed to collect high-quality cloud data from all over the globe.
but already the data that Collis and his collaborators have collected is improving next-generation climate models.
In typical cloud computing users connect to a powerful centralized data center. But Cappos's cloud is less of a dense thunderhead and more of a fog.
The data he has gathered explains how pathogens ride on wind currents and provides a glimpse into an almost unknown ecosystem far above our heads.
With the data he has collected thus far Schmale has built a model of atmospheric circulation that shows large sections of air sweeping across the face of the planet like waves across an ocean transporting dust and microbes thousands of miles.
--The Editors. Bell Labs, Alcatel-Lucent. Saving the Internet from itself. Nearly all communications data (Web, phone, television) runs through a network of fiber-optic cables.
But within a decade, data traffic is expected to outgrow infrastructure, which will result in transmissions that are garbled, slow
Nicolas Fontaine, an optical engineer at Bell Labs, Alcatel-Lucent, has devised a clever way to avoid a data bottleneck.
in order to cram a lot more data into a single optical fiber. It works by routing different light beams, called modes, along carefully planned pathways;
he has already shown that his multiplexer can send six light streams down 497 miles of fiber without losing data along the way.
This chart represents all those listed in the CDC database as accidental poisoning, intentional self-poisoning, assault by drugs
Also, the database doesn't include nonresidents, either undocumented immigrants or U.S. citizens living abroad.
in order to even show up as a single-pixel-wide line on this graph. Diverdan, I believe 'treatment related' would fall under unintentional self-harm, the largest sub-category.
Roxana Geambasu Exposes How Companies Use Your Data. As a computer scientist, Roxana Geambasu of Columbia University says she picks new projects based on
She hates ceding control of her personal data online, which is why she's building software that allows people to see where the information they upload to the cloud goes.
In order to understand how companies share data, Geambasu devises clever ways to track the repercussions. Her latest software uses a series of shadow accounts to see how ads change
Other programs she's designed make data self-destruct after a set period of time, help users track
what information they've entered where, and limit data breaches from lost or stolen phones.
But this idea of services declaring what data they use is going to become extremely important.
As long as companies aren't saying what they do with users' information, she'll work to make public how that personal data is shared.
Jonathan Viventi Builds Devices That Decode Thoughts. Existing brain implants require individual electrodes to be wired to an external device for data processing.
or extend voice and data networks. C. Three double-braided polymer tethers prevent the airship from drifting away.
D. The ground station, responding to sensor data from the BAT, helps the craft hunt for optimal wind conditions around 30 mph.
Satellite Data Maps Seafloor's Hidden Depths. While many detailed maps exist of Earth's continents
Harnessing never-before-used satellite altimeter data from the European Space Agency's (ESA) CryoSat-2 and NASA's Jason-1, the scientists have created stunning maps of Earth's entire seafloor, bringing to light mountains
So we use that data to generate a topography of the ocean's surface. That topography highlights subtle variations and bumps in the ocean's waters, telling a lot about the surface underneath.
which has been taking high-resolution measurements of Earth's gravity for the past four years, with those of the American-German orbiter GRACE, which uses gravity data to measure changes in ice mass.
This animation shows the two data sets overlaid: Climate change is having other measurable impacts on the southern continent.
Data from the ESA's CryoSat satellite shows that West Antarctica's seasonal ice melt has sped up by a factor of three
From that data it predicts that the battery would lose 15 percent of its capacity after 10 years of daily use.
Patrick Meier, director of social innovation at the Qatar Computing Research Institute, applies artificial intelligence to this crowdsourced data, organizing digital photos and messages into dynamic maps that can guide real-world
and create a live data set for every aspect of a space, including the electrical engineering, millwork, and piping.
The individual beams are created the same way pixels on a projector are. There's a semiconductor chip that has an array of a million tiny mirrors on it.
The mirrors flip to modulate each pixel's brightness. This way, the system can turn off some beams sometimes without the driver noticing too much.
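As a rough illustration of that idea (not the vendor's actual software), the beam pattern can be modelled as a boolean mask over the mirror array; clearing part of the mask "turns off" the corresponding beams, for example around an oncoming driver. The array resolution below is a made-up value.

```python
# Illustrative sketch only: each element of the mask stands for one micro-mirror/beam.
import numpy as np

mirrors = np.ones((768, 1366), dtype=bool)   # hypothetical mirror-array resolution

def mask_region(mirrors, top, left, bottom, right):
    """Disable the beams that would hit a detected object (e.g. a driver's eyes)."""
    mirrors[top:bottom, left:right] = False
    return mirrors

mirrors = mask_region(mirrors, 300, 600, 380, 760)
print(f"{mirrors.mean():.1%} of beams remain on")
```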
Sensors in the helmet collect data on impact force, linear or rotational acceleration, and location.
when the glasses don't have a data connection, which is nice. It could also keep the images the glasses record for the app more secure; the images are supposed to stay on the device
and weak health infrastructure and where accurate data on the spread of the disease is difficult to come by.
covered in pixel camouflage, and furnished with stadium seating for its two crew members, the Advanced High Performance Reconnaissance and Surveillance Aircraft (AHRLAC) looks like an alternate-history version of a World War I fighter.
other information the project has put online suggests that it has succeeded at consistently filtering out noise-like car vibrations from the sensor data,
With the increasing volumes of digitized art databases on the Internet comes the daunting task of organizing
to properly manage the databases of these paintings, it becomes essential to classify paintings into different categories and sub-categories.
if we can infer new information about an unknown painting using already existing databases of paintings.
The data set contains 1,710 high-resolution images of paintings by 66 artists, spanning the time period 1412-1996
We also collected a ground-truth data set for the task of artistic influences which mainly contains positive influences claimed by art historians.
British National Oceanographic Centre launches major unmanned exploration mission. Collecting oceanographic data is usually accomplished by a combination of satellites, buoys
This data will help scientists map the distribution of the fronts and their associated fauna.
and is designed to gather oceanographic data over long periods of time. C-Enduro. C-Enduro is an innovative long-endurance autonomous surface vehicle built by Autonomous Surface Vehicles (ASV).
and can remain at sea for months at a time while transmitting data back to shore. Two gliders are used in the NOC project
and to record different types of data. The Robotic Exploration of Ocean Fronts project is divided into two phases.
or a limit sensor. With the majority of stepping motors, a backup battery is needed to store position data
and because batteries have a limited life, data can't be stored for a long time. But the AZ Series doesn't need a battery
so storing data isn't a concern, even if the production equipment is stopped for a long time or the unit is shipped overseas. What is so special about this new product?
and research developer of OpenBCI, a low-cost open-source hardware platform that records the brain's electrical signals and uses devices and software languages to make the data easily accessible.
This has implications for both the cost and availability of data for machine perception and learning.
The ability for machines to tap into multiple online data sources and services allows them to quickly stitch together value reducing time
Location-based data, financial and weather data, and increasingly healthcare data are all examples. They and many others active in the field also believe that the robotics future won't look much like the current mass-visage of the technology
onboard data storage and a 500-lumen LED projector. We are already working on more sophisticated features, including live streaming data, gimbal integration, ultrasonic sound capabilities, and improved flight time.
It is not hard to imagine a slightly sci-fi future where micro UAVs with similar broadcasting abilities are used to deliver breaking news, disaster and traffic management, emergency warnings, and maybe an advertisement or two.
the big data revolution and energy-efficient computing; satellites and commercial applications of space; robotics and autonomous systems;
Using a two-step approximation [1], we show how this data can be combined with accurate sensor and receiver characteristics to calculate not only the bearing to the other robots but also their heading and eventually the range.
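The two-step approximation itself isn't reproduced here, but as a rough stand-in for the first part of the problem, a bearing estimate from a ring of infrared receivers can be sketched as an intensity-weighted circular mean of the receiver directions; the function name and eight-receiver layout below are assumptions for the example.

```python
# Rough illustration (not the paper's actual method): estimate the bearing to a
# neighbouring robot from one intensity reading per receiver, with the receivers
# evenly spaced around the robot's body.
import math

def bearing_estimate(intensities):
    n = len(intensities)
    x = sum(i * math.cos(2 * math.pi * k / n) for k, i in enumerate(intensities))
    y = sum(i * math.sin(2 * math.pi * k / n) for k, i in enumerate(intensities))
    return math.atan2(y, x)  # radians, in the robot's own frame

# Example: 8 receivers, strongest signal between receivers 2 and 3.
print(math.degrees(bearing_estimate([0.1, 0.4, 1.0, 0.9, 0.3, 0.1, 0.0, 0.0])))
```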
We are currently working on using this feature for data transmission, enabling programming of an entire swarm not only via infrared but directly via the floor.
which allows users to choose how much to simplify 3d models in the database during the deployment stage
In the not so distant future the fish-bot could be put to use for covert science missions where it might be able to infiltrate schools of real fish to collect data about their behavior.
and it was found that the brain automatically assimilated data from multiple sensors in the palm of the hand.
At the EcoCloud center at EPFL, Dr. Edouard Bugnion and Professor Babak Falsafi are carrying out research into energy-efficient memory architectures for data centres that can handle huge amounts of data.
and IP instead in the same way that Wolfram is talking about owning the database of the internet of things.
and quantifies this into data, which enables physical therapists to provide accurate instruction on movement and patients to confirm their own movements. This robot has a force sensor
Based on data collected from these sensors, calculations are made on the force of muscle contraction within the arm, as well as on the amount of calories consumed by each muscle during training.
In addition, we use augmented reality technology to make these results visible. By databasing measurement data collected during training, the Tsuji Lab is constructing a rehabilitation cloud system
and pulse waveform, storing the data in the cloud. When we talked to people, they often said that measuring each health indicator separately is time-consuming and bothersome.
If possible, people want to obtain all their health data in one go. The equipment needed to do that is still large, but we've made it as compact as possible.
Our idea is that people could check their health data regularly in places they often visit
Because all the data is saved, the physician can give suitable advice while looking at the information the user has obtained so far.
Once inspection is complete, the data is analysed and suspicious areas are marked. Available data from earlier inspections is used to detect changes
and the damage map is updated. The next time a ship is due for inspection, the damage map is checked
and, since the inspectors already have access to the data, they can identify critical problems in advance
and control tasks (with the exception of a microcontroller for bundling sensor data) are executed on a base station that runs the Robot Operating System.
The robot uses four Sharp infrared distance sensors for edge and obstacle detection, and an InvenSense MPU-6050 IMU provides the operator with orientation data.
So for people who've been thought of as totally immobile until now, this system could open up a lot more possibilities based on scientific data, such as shoulder movements, for example. Another option is called Face Switch
and to gather quantitative data about what mechanisms help people work with these swarms most effectively.
The force on the tip of the robot is estimated from the air pressure data, and that information is sent to the surgeon's master robot.
in the case where external sensors are used (e.g., a 3D motion capture system), the data can be sent to the units over this wireless link.
The only information that a unit needs for flight is its local sensor data and its position with respect to the vehicle centre of mass.
we needed a very large robotic data set of visually salient image features paired with corresponding robotic state estimates,
and it was simply infeasible to gather all this data by ourselves. That's where the Parrot AR.
We thought we could develop a game for the Parrot that would serve as a means for crowdsourcing the data we needed.
data-gathering could be done in a gaming context, making it fun for people to participate in the experiment.
which can be used to send data over the internet. The code for communicating with the Parrot is open source. Methodology: The question we faced experimentally was how to couple the real-world object to the virtual space in which the drone would be flying.
Figure 1 shows two configurations that give exactly the same marker detection data. In one case, the marker is in the left part of the image
sending and receiving data from the drone, performing state estimation (x, y, z) with respect to the marker, and rendering the 3D world.
After extracting the vision data and concatenating this with the state estimates, the data is encrypted for transmission.
Finally, since some players may use 3G for sending their data, we considered that the data should be as compact as possible.
Therefore, the data is compressed on board the iPhone before being sent. The data from 5 textured images on average takes around 77 kB.
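The app's real code isn't shown in the post, but the general idea can be sketched as bundling the extracted image features and state estimates into one payload and compressing it before upload; the field names below are hypothetical.

```python
# Hedged sketch of the compress-before-upload step described above.
import json
import zlib

payload = {
    "state": {"x": 0.42, "y": -0.10, "z": 1.35},             # pose w.r.t. the marker
    "features": [[12.0, 48.5, 0.91], [301.2, 77.8, 0.84]],   # (u, v, score) per feature
}

blob = zlib.compress(json.dumps(payload).encode("utf-8"))
print(f"{len(blob)} bytes to upload")   # decompress server-side with zlib.decompress
```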
Testing and Launch Our original plan was to work on the game from February to April 2012,
launch it in May or June, and analyze the data from August to September. In reality, we didn't converge on the final code until November 2012.
Of course, when you are developing the app, you know the program inside out and by habit you may avoid playing the game in such a way that would make the program malfunction.
So in November we invited other people to try out and rate the app. The ratings and comments were used to perform some final adjustments,
and 458 of these people had already contributed to our experiment by sending their data. This is a nice start,
it cannot send its data. There was no message that showed the absence of an internet connection (solved in version 1.2). At the same time
we have started analyzing all the data that is coming in. In the future, we may also improve our crowdsourcing game in various ways,
In any case, it's an adventure to work with this exciting new way of gathering robotic data.
Kubi telemedicine device gets HIPAA clearance for streaming medical data. Revolve Robotics and Swymed have collaborated to create a HIPAA-compliant telepresence device called Kubi that can stream medical data.
This compliance is a big deal: no longer do MDs have to rely only upon what they see,
they can use data streaming directly to the app to help make decisions. In smaller rural hospitals or even in ambulances, where a specialist cannot be physically there,
HIPAA compliance means that medical data can be streamed. A simple MD telepresence consult, which is similar to a telephone call,
But once you start implementing data, you enter into a new realm of regulations. While the RP-VITA by iRobot/InTouch Health has HIPAA compliance as well,
and analyze the data. That's why I'm so excited by this. While this initial result is essentially a proof of concept rather than a practical system,
and their overall business. Made possible by the digital economy, forward-thinking businesses are choosing to embrace this value to intentionally reimagine the economy around how we use resources,
In his blog, 99 Mind-Blowing Ways the Digital Economy Is Changing the Future of Business, Vivek Bapat revealed that 68% of consumers are interested in companies that bring social and environmental change.
Conversely, when data shows that students aren't using specific videos at all, or if students give feedback that a given topic isn't pertinent to the work environment,
which could potentially deliver data at rates that are up to 100 times faster than current wireless technology.
Multiplexing is the process of separating streams of data that are traveling through a single medium
"Each frequency can carry a separate data stream, and the data receiver at the other end could pick up an individual stream by accepting radiation at a specific angle.
The researchers published a paper, "Frequency-Division Multiplexing in the Terahertz Range Using a Leaky-Wave Antenna,"in Nature Photonics on September 14.
they used the Southern African Large Telescope to gather data to help determine the size of the black hole.
The data indicated the black hole is 30 times larger than expected for this size of galaxy, according to a press release from the Royal Astronomical Society.
"Very few people have data supporting it," says Judy Van de Water, an internal medicine specialist at the University of California,
Davis, who did not take part in the research. "This study is getting data that can begin addressing the connection more directly."
Light-based memory chip is first to permanently store data. Today, electronic computer chips work at blazing speeds.
and moves data with photons of light instead of electrons would make today's chips look like proverbial horses and buggies.
When electrons move through the basic parts of a computer chip (the logic circuits that manipulate data),
can store data only if they have a steady supply of power. When the power is turned off,
the data disappear, too. Now, researchers led by Harish Bhaskaran, a nanoengineering expert at the University of Oxford in the United Kingdom,
and CDs and DVDs use this difference to store data. To read out the data, stored as patterns of tiny spots with a crystalline
whether they could use this property to permanently store data on a chip and later read it out. To do so,
To write data in this layer, the scientists piped an intense pulse of light into the waveguide.
When the researchers wanted to read the data, they beamed in less intense pulses of light
they knew their data spot on the GST had an amorphous order; if more was absorbed, that meant it was crystalline.
and their colleagues also took steps to dramatically increase the amount of data they could store
and read multiple bits of data simultaneously, something you can't do with electrical data storage devices. And, as they report this week in Nature Photonics,
by varying the intensity of their data-writing pulses, they were also able to control how much of each GST patch turned crystalline or amorphous at any one time.
With this method they could make one patch 90% amorphous but just 10% crystalline, and another 80% amorphous and 20% crystalline.
That made it possible to store data in eight different such combinations, not just the usual binary 1s and 0s that would be used for 100% amorphous or crystalline spots.
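A simplified sketch of that multilevel idea (not the Oxford group's code): with eight distinguishable crystalline fractions, each spot carries three bits instead of one, and encoding or decoding is just a mapping between bit values and target fractions.

```python
# Illustrative multilevel encoding: 8 levels per spot -> 3 bits per spot.
import math

LEVELS = 8                                   # e.g. 0%, ~14%, ..., 100% crystalline
BITS_PER_SPOT = int(math.log2(LEVELS))       # 3 bits

def encode(bits):
    """Map a 3-bit value (0-7) to a target crystalline fraction for one spot."""
    return bits / (LEVELS - 1)

def decode(fraction):
    """Read back the stored bits from a measured crystalline fraction."""
    return round(fraction * (LEVELS - 1))

for value in range(LEVELS):
    assert decode(encode(value)) == value
print(f"{BITS_PER_SPOT} bits per spot, e.g. value 5 -> {encode(5):.0%} crystalline")
```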
This dramatically boosts the amount of data each spot can store, Bhaskaran says. Photonic memories still have a long way to go
and the data are exciting, but we will need to move carefully," says Steven Deeks, an HIV/AIDS clinician at the University of California, San Francisco (UCSF),
But perhaps more important than just a way to control a robot in the absence of knowing how to do it autonomously is being able to observe and collect data from the robot.
Given hours of data recording the details of human strategies for balance and pose adjustment,
(I/O)-intensive applications such as server farms is the energy required to transport bits of data around.
Into the microprocessor foundry. Advances in microprocessor performance increasingly are limited by the ability to feed data into the microprocessor
and the energy cost of getting the data, says Rajeev Ram, professor of electrical engineering at MIT.
Applications range from data processing and communications to sensors on a chip.
A new molecular design approach. For decades, materials scientists have worked to infuse the lessons learned from natural proteins into the design of new materials.
From these configurations, the program creates Protein Data Bank (PDB) files to be passed to molecular dynamics software.
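The authors' program isn't shown, but the PDB hand-off step can be illustrated with a minimal writer that emits fixed-column ATOM records for a candidate structure; the coordinates and residue below are made up for the example.

```python
# Minimal illustration (not the authors' actual code): write PDB ATOM records so a
# candidate structure can be handed to molecular dynamics software.
def pdb_atom(serial, name, resname, chain, resseq, x, y, z, element, occ=1.00, bfac=0.00):
    # Column layout follows the PDB ATOM record specification; single-letter
    # elements conventionally get a leading space in the atom-name field.
    padded = f" {name:<3s}" if len(element) == 1 else f"{name:<4s}"
    return (f"ATOM  {serial:5d} {padded}"
            f" {resname:>3s} {chain:1s}{resseq:4d}    "
            f"{x:8.3f}{y:8.3f}{z:8.3f}{occ:6.2f}{bfac:6.2f}"
            f"          {element:>2s}")

with open("candidate.pdb", "w") as fh:
    fh.write(pdb_atom(1, "N",  "ALA", "A", 1, 11.104, 6.134, -6.504, "N") + "\n")
    fh.write(pdb_atom(2, "CA", "ALA", "A", 1, 11.639, 6.071, -5.147, "C") + "\n")
    fh.write("END\n")
```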
Ultimately, they hope to create an extensible database of structures for engineers to estimate the final configuration that a new material
The self-learning database, after running a stream of simulations, will record the protein's preferred conformations and store the final structures. With this program,