Transistors, which form the basis of today's computing, are tiny devices that switch the flow of electric current off and on,
"##The research team included faculty members in bioengineering, chemical and biomolecular engineering, chemistry, electrical and computer engineering and mechanical science and engineering;
The use of soft, stretchable material would enable a new generation of wearable devices that fit themselves to the human body.
"Our team aims to develop comfortable wearable devices. This ink was developed as part of this endeavor, "says Someya."
Researchers in UCSB's Department of Electrical and Computer Engineering are seeking to make computer brains smarter by making them more like our own.
'"Halas, Rice's Stanley C. Moore Professor of Electrical and Computer engineering and professor of chemistry, bioengineering, physics and astronomy,
an associate professor of electrical and computer engineering at NC State and corresponding author of a paper describing the work.
Electrical and computer engineering associate professor Rajesh Menon and colleagues describe their invention today in the journal Nature Photonics. Silicon photonics could significantly increase the power and speed of machines such as supercomputers
"With all light, computing can eventually be millions of times faster, "says Menon. To help do that, the U engineers created a much smaller form of a polarization beamsplitter
Thanks to a new algorithm for designing the splitter, Menon's team has shrunk it to 2.4 by 2.4 microns,
"says study co-lead author Yu-Chih Chen, a postdoctoral researcher in Electrical engineering and Computer science at the University of Michigan College of Engineering.
and biology,"says study co-senior author Euisik Yoon, Ph d.,professor of electrical engineering and computer science and of biomedical engineering and director of the Lurie Nanofabrication Facility at the U-M College of Engineering."
Many researchers see improved interconnection of optical and electronic components as a path to more efficient computation and imaging systems.
which has limited options for B-cell-based vaccine programming. Using CellSqueeze circumvents this problem, and by being able to separately configure delivery and activation,
but the algorithms that handle sound and image processing are inspired by biology, says Professor Øyvind Brandtsegg at NTNU.
the cornerstone of genetic programming. The transistor is the central component of modern electronic systems. It acts both as a switch and as a signal amplifier.
Understanding the effects that these ultra-intense x-ray pulses will have on their potential targets will take the teamwork of Argonne National Laboratory's Advanced Photon Source (APS) and the Argonne Leadership Computing Facility (ALCF), both
The team uses a hybrid code employing both molecular dynamics (MD) and Monte Carlo (MC) algorithms.
the MC algorithm uses a pre-computed database to update and track the electronic configuration of every particle interacting with an x-ray pulse.
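The articles don't include the code itself, but the shape of such a hybrid loop is standard. Here is a minimal sketch, in which an illustrative transition table stands in for the pre-computed database; all names, states and probabilities are made up, not Argonne's actual model:

```python
import random

# Illustrative transition table standing in for the pre-computed database:
# maps an electronic configuration to possible transitions and their
# per-timestep probabilities (values are made up).
RATE_TABLE = {
    "ground":  [("excited", 0.01)],
    "excited": [("ionized", 0.05), ("ground", 0.02)],
    "ionized": [],
}

def mc_update(config):
    """Monte Carlo step: stochastically pick an electronic transition."""
    for new_config, prob in RATE_TABLE[config]:
        if random.random() < prob:
            return new_config
    return config

def md_step(position, velocity, force, dt=1e-3):
    """Molecular dynamics step: symplectic-Euler-style update of motion."""
    velocity += force * dt
    position += velocity * dt
    return position, velocity

# Hybrid loop: advance nuclear motion with MD, electronic state with MC.
pos, vel, config = 1.0, 0.0, "ground"
for step in range(1000):
    pos, vel = md_step(pos, vel, force=-pos)  # toy harmonic restoring force
    config = mc_update(config)
print(config)
```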
The new algorithm, published in Nature Methods ("Efficient set tests for the genetic analysis of correlated traits"),
until now required so much computation that it would take a year to run a single complex query."
The researchers tested their algorithm on data from two studies from public repositories, and compared the results with existing state-of-the-art tools.
"Our algorithm can be used to study up to half a million individuals-that hasn't been possible until now.""
The new algorithm provides much-needed methods for genomics, making large-scale, complex analysis a manageable and practical endeavour."
Computations made by the group of Professor Thomas Heine from Jacobs University Bremen, which is also involved in the project,
Jaeyoun (Jay) Kim, an Iowa State University associate professor of electrical and computer engineering and an associate of the U.S. Department of Energy's Ames Laboratory.
and computer engineering and is moving to postdoctoral work at the University of Pennsylvania in Philadelphia. The paper describes how the engineers fabricated microtubes just 8 millimeters long and less than a hundredth of an inch wide.
This material class therefore has enormous potential for future applications in information technology.
#Sweeping lasers snap together nanoscale geometric grids
Down at the nanoscale, where objects span just billionths of a meter,
#Toward tiny, solar-powered sensors
The latest buzz in the information technology industry regards "the Internet of Things": the idea that vehicles, appliances, civil-engineering structures, manufacturing equipment,
an MIT graduate student in electrical engineering and computer science and first author on the new paper. "We need to regulate the input to extract the maximum power,
They are also used in flashes in mobile phones and as a complementary technology to batteries in order to boost performance.
"said Boubacar Kant, a professor in the Department of Electrical and Computer engineering at the UC San diego Jacobs School of engineering and the senior author of the study."
In a paper published July 10 in the journal Physical Review Letters ("Extraordinarily large optical cross section for localized single nanoresonator"), Zongfu Yu, an assistant professor of electrical and computer engineering,
Graph-theoretic algorithms and optimization techniques are then used to calculate the DNA sequences needed to produce the structure.
Advanced computing methods are likely to be a key enabler in the scaling of DNA nanotechnology from fundamental studies towards groundbreaking applications,
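The graph-theoretic routing itself is beyond a short example, but the final step, emitting strand sequences, rests on Watson-Crick complementarity. A minimal sketch (the scaffold segment is made up, and real design tools handle crossovers and routing that this ignores):

```python
# Watson-Crick pairing: A<->T, C<->G.
COMPLEMENT = str.maketrans("ACGT", "TGCA")

def staple_for(scaffold_segment: str) -> str:
    """Return the staple strand that binds a scaffold segment:
    its reverse complement."""
    return scaffold_segment.translate(COMPLEMENT)[::-1]

# Hypothetical scaffold segment; a real tool would pick segments from
# the routing computed by the graph algorithms.
print(staple_for("ATTACGCG"))  # -> CGCGTAAT
```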
New ways of generating spin currents may be important for low-power, high-speed spin-based computing (spintronics),
"said Maiken Mikkelsen, an assistant professor of electrical and computer engineering and physics at Duke.""We can now start to think about making fast-switching devices based on this research,
working with a team of researchers led by Alexandra Boltasseva, an associate professor of electrical and computer engineering,
and Vladimir M. Shalaev, scientific director of nanophotonics at Purdue's Birck Nanotechnology Center and a distinguished professor of electrical and computer engineering.
Lobo and Levin developed an algorithm that would use evolutionary computation to produce regulatory networks able to "evolve" to accurately predict the results of published laboratory experiments that the researchers entered into a database. "Our goal was to identify a regulatory network that could be executed in every cell
Tufts biologists developed an algorithm that used evolutionary computation to produce regulatory networks able to "evolve" to accurately predict the results of published research on planarian regeneration.
The algorithm compared the resulting shape from the simulation with real published data in the database.
First Regenerative Model Discovered by Artificial Intelligence
The researchers ultimately applied the algorithm to a combined experimental dataset of 16 key planarian regeneration experiments to determine
After 42 hours, the algorithm returned the discovered regulatory network, which correctly predicted all 16 experiments in the dataset.
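The published models are far richer, but the evolutionary loop described above has a simple shape. A minimal sketch, with a stub "simulator" and a toy dataset standing in for the regulatory-network models and the 16 published experiments:

```python
import random

# Toy "experiments": inputs paired with observed outcomes (illustrative).
DATASET = [(x, (3 * x + 1) % 7) for x in range(16)]

def simulate(network, x):
    """Stub simulator; the real system runs a candidate regulatory
    network in a virtual flatworm and reports the regenerated shape."""
    a, b, m = network
    return (a * x + b) % m

def fitness(network):
    """Fraction of experiments whose outcome the network predicts."""
    return sum(simulate(network, x) == y for x, y in DATASET) / len(DATASET)

def mutate(network):
    """Randomly perturb one parameter of the candidate network."""
    i = random.randrange(3)
    net = list(network)
    net[i] = max(1, net[i] + random.choice((-1, 1)))
    return tuple(net)

# Evolve: mutate the current best candidate, keep non-worse offspring.
best = (1, 0, 5)
for _ in range(100000):
    if fitness(best) == 1.0:
        break  # network now predicts every experiment in the dataset
    candidate = mutate(best)
    if fitness(candidate) >= fitness(best):
        best = candidate
print(best, fitness(best))
```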
Lobo and Levin are both trained in computer science and bring an unusual perspective to the field of developmental biology.
Levin majored in computer science and biology at Tufts before earning his Ph.D. in genetics. Lobo earned a Ph.D. in the field before joining the Levin lab. The paper represents a successful application of the growing field of "robot science," which Levin says can help human researchers by doing much more than crunch enormous datasets quickly. "While
the artificial intelligence in this project did have to do a whole lot of computations, the outcome is a theory of
the team devised a computer algorithm to process OCT data and, nearly instantaneously, generate a color-coded map with cancer in red and healthy tissue in green. "We envision that the OCT would be aimed at the area being operated on,
with miniaturised electronics that can use algorithms to recognise touches or swipes, ATAP says. The data can be sent wirelessly to smartphones or other devices,
Founder Bob Roohparvar, a computer science professor at California State University, likened the technology to a tube of toothpaste. "If you just squeeze from the top,
Program director for the US Navy Captain Jeff Dodge likened the upgrade from the MQ-8B, which is based on a smaller airframe, to the new model aircraft to a brain transplant. "We are taking the computer
a professor in the Department of Electrical and Computer Engineering and the senior author on the Science paper. "Our approach conditions the information even before it is sent,
#Researchers develop new algorithm to empower robots to learn like humans
New algorithms enable robots to learn motor tasks through trial and error, like humans.
The algorithm draws on a branch of artificial intelligence known as deep learning. The researchers chose the Berkeley Robot for the Elimination of Tedious Tasks (BRETT) to take up the challenge of applying this relatively promising form of artificial intelligence.
The researchers claim that less pre-programming is required when the algorithm is used in the robot.
It also provides the capacity to work outside controlled environments such as medical centers, factories or laboratories.
an associate professor in the campus electrical engineering and computer sciences department, developed the new algorithm. Abbeel said the best thing about the technique is that it eliminates the need for reprogramming
Use of the algorithm is currently seen in voice recognition software, such as the iPhone's Siri
#New algorithm enables robot to learn through trial and error
UC Berkeley's BRETT (Berkeley Robot for the Elimination of Tedious Tasks) is capable of learning through trial and error, like humans.
New algorithms developed by researchers empower the robot to master tasks through trial and error, eliminating the need for pre-programming.
Among the many tasks it can perform is assembling a toy, and the best part is that it keeps trying to figure out how to accomplish the task until it finally succeeds.
New algorithms developed by researchers from UC Berkeley brought this trial and error process to robots. UC Berkeley said in a press release that the technology is a giant leap in the field of artificial intelligence.
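These reports don't include the underlying code, but the trial-and-error principle can be shown in miniature. A sketch with a stand-in reward function, not Berkeley's actual deep reinforcement learning pipeline:

```python
import random

TARGET = 0.7  # hypothetical motor command that accomplishes the task

def reward(action):
    """Stand-in reward: higher when the attempted motion is closer to
    success; BRETT instead scores real task progress from its sensors."""
    return -abs(action - TARGET)

# Trial and error: perturb the current action, keep what scores better.
action = 0.0
for trial in range(200):
    candidate = action + random.gauss(0, 0.1)
    if reward(candidate) > reward(action):
        action = candidate
print(f"learned action {action:.3f}, reward {reward(action):.3f}")
```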
For more information, visit www1.lehigh.edu.
After identifying brain cancer's OCT signature, researchers at Johns Hopkins University have developed a computer algorithm that rapidly generates a color-coded map that shows cancer in red and healthy tissue in green.
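The published method derives its cancer/healthy contrast from optical attenuation in the OCT signal. A minimal sketch of the color-mapping step, assuming a per-pixel attenuation image; the cutoff value and the low-attenuation-means-cancer convention are illustrative, not the study's calibrated parameters:

```python
import numpy as np

CUTOFF = 5.5  # illustrative attenuation threshold (mm^-1)

def color_map(attenuation):
    """Map a 2D attenuation image to RGB: red where values suggest
    cancer, green where they suggest healthy tissue."""
    cancer = attenuation < CUTOFF  # assumption: low attenuation = cancer
    rgb = np.zeros(attenuation.shape + (3,))
    rgb[..., 0] = cancer      # red channel marks suspected cancer
    rgb[..., 1] = ~cancer     # green channel marks healthy tissue
    return rgb

demo = np.random.uniform(3.0, 8.0, size=(4, 4))
print(color_map(demo)[..., 0])  # 1.0 marks "cancer" pixels
```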
Taking inspiration from modern computing methods, Orth and colleagues at Harvard and Thermo Fisher Scientific Inc. of Pittsburgh worked to overcome the limitations imposed by current multispectral microscopes.
with algorithms and sensors that automatically adjust the angle of the foot during different points in its wearer's stride.
Similar to the movie, researchers at MIT's Computer Science and Artificial Intelligence Laboratory have created an object recognition system that can accurately identify objects using a normal RGB camera (no threatening blood-red color filter required).
The algorithm will improve over time, Amazon told technology site CNET. Its first effects may not be visible for some time, as the work only began on Friday.
algorithms that take into account nearby faults and other factors to broadcast a warning of affected areas, estimated intensities and apparent epicenter.
The sensors use a patented algorithm to filter the signals they pick up into two heartbeat recordings.
using algorithms that make the arm easier to use. "A good example is we actually had an amputee use the wireless brainwave headset to control a hand,
OPM detected new malicious activity affecting its information systems in April and the Department of Homeland Security said it concluded at the beginning of May that the agency's data had been compromised.
Computation used the Extreme Science and Engineering Discovery Environment (XSEDE), which is supported by NSF grant OCI-1053575,
They can even perform computations based on changes in the environment or movement, and respond to human vital signs.
and Steven P. Levitan, Ph.D., John A. Jurenko Professor of Electrical and Computer Engineering, integrated models for self-oscillating polymer gels and piezoelectric micro-electro-mechanical systems to devise a new
reactive material system capable of performing computations without external energy inputs, amplification or computer mediation. The studies combine Balazs' research in Belousov-Zhabotinsky (BZ) gels, a substance that oscillates in the absence of external stimuli,
"allowing the material to be used for computation. Levitan adds, however, the computations would not be general purpose,
but rather specific to pattern-matching and recognition, or other non-Boolean operations. "Imagine a group of organ pipes,
and respond accordingly, thereby performing the actual computing." Developing so-called "materials that compute" addresses limitations inherent to the systems currently used by researchers to perform either chemical computing or oscillator-based computing.
Chemical computing systems are limited by both the lack of an internal power system and the rate of diffusion as the chemical waves spread throughout the system,
Further, oscillator-based computing has not been translated into a potentially wearable material. The hybrid BZ-PZ model,
"With all light, computing can eventually be millions of times faster, "Menon said. Menon and his team figured out how to take current beamsplitters,
by developing an inverse design algorithm that tells them exactly how to build the silicon structures they need to perform a desired task.
They've already used the algorithm to design a working optical circuit, and have made several copies in their lab. Reporting in Nature Photonics,
who worked on the algorithm. "The fact that we could build devices this robust on our equipment tells us that this technology will be easy to mass-produce at state-of-the-art facilities."
By designing very precise segments of silicon and pairing them together, according to the instructions of the algorithm, the team is able to create switches or conduits that control the flow of photons,
By creating an algorithm that automates the development of these complex Swiss cheese silicon structures, the team has essentially "set the stage for the next generation of even faster
The algorithm could also be used to find design solutions to many other communication problems-all a researcher needs to do is plug in their desired result,
and the algorithm will come up with a plan. We're pretty excited to see what they do with it next.
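The articles don't spell out the algorithm, but one widely used inverse-design strategy for such pixelated silicon devices is a direct binary search. A toy sketch of that idea, with a stand-in figure of merit in place of a real electromagnetic simulation:

```python
import random

N = 8  # design grid: N x N pixels, each silicon (1) or air (0)

def figure_of_merit(grid):
    """Stand-in objective; a real tool would simulate the structure
    electromagnetically and score how well it performs its task."""
    target = [[(i + j) % 2 for j in range(N)] for i in range(N)]
    return sum(g == t
               for grow, trow in zip(grid, target)
               for g, t in zip(grow, trow))

grid = [[random.randint(0, 1) for _ in range(N)] for _ in range(N)]

# Direct binary search: toggle one pixel at a time, keep any toggle that
# raises the figure of merit, and sweep until no single toggle helps.
improved = True
while improved:
    improved = False
    for i in range(N):
        for j in range(N):
            before = figure_of_merit(grid)
            grid[i][j] ^= 1
            if figure_of_merit(grid) > before:
                improved = True
            else:
                grid[i][j] ^= 1  # revert an unhelpful toggle
print(figure_of_merit(grid))  # reaches the toy optimum of N * N
```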
"Imagine if when you run a set of computations that not only information is processed but physical matter is manipulated algorithmically as well.
And it carries information using the absence or presence of water droplets as the 1s and 0s of its binary code."
and play around with the design to help take it to the next level. "We are trying to bring the same kind of exponential scale-up we saw because of computation in the digital world into the physical world,
By developing cloud databases and algorithms to store all of this data, the researchers behind Project Premonition hope to build a robust system capable of spotting dangers to humans and wildlife alike in the future.
which involves converting data into a set of algorithms that a computer can make sense of.
Using an algorithm, they worked out how the words are clustered, and they then looked up the different definitions of each word in a dictionary.
"This material class therefore has enormous potential for future applications in information technology
#Here's how to make carbon nanoparticles with honey and a microwave
Carbon nanoparticles can be incredibly useful in the treatment of many types of disease,
It states that computing power has the potential to double every two years, and so far, it's held true.
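Compounded, doubling every two years works out to a 2^5 = 32-fold increase in computing power over a decade.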
1,000 times faster than current flash memory: Tech giants Intel and Micron have announced a new class of computer memory called 3D XPoint,
It's also 10 times denser than current flash, meaning it could lead to smaller components and ultimately even smaller devices.
so that different laser flashes are sent in different directions," says Ulrich Schmid. To experience the 3D effect, the viewer must be positioned within a certain distance range from the screen.
Additional authors on the ACS Nano paper include UW-Madison materials science and engineering graduate students Gerald Brady, Yongho Joo and Matthew Shea, and electrical and computer engineering graduate student Meng-Yin
Capacitors use an electrostatic charge to store energy that they can release quickly, to power a camera's flash, for example.
and understands it," said Yiannis Aloimonos, UMD professor of computer science and director of the Computer Vision Lab, one of 16 labs and centers in UMIACS.
a while for computing technology to catch up. Similar versions of neural networks are responsible for the voice recognition capabilities in smartphones
#Vision system for household robots
Researchers at MIT's Computer Science and Artificial Intelligence Laboratory believe that household robots should take advantage of their mobility
In a paper appearing in a forthcoming issue of the International Journal of Robotics Research, the MIT researchers show that a system using an off-the-shelf algorithm to aggregate different perspectives can recognize four times as many objects as one that uses a single perspective.
They then present a new algorithm that is just as accurate but that in some cases is 10 times as fast, making it much more practical for real-time deployment with household robots.
and computer science and lead author on the new paper. "One way around that is just to move around
Wong and his thesis advisors--Leslie Kaelbling, the Panasonic Professor of Computer Science and Engineering, and Tomás Lozano-Pérez, the School of Engineering Professor of Teaching Excellence--considered scenarios in which they had 20 to 30
The first algorithm they tried was developed for tracking systems such as radar, which must also determine
For each pair of successive images, the algorithm generates multiple hypotheses about which objects in one correspond to which objects in the other.
To keep the calculation manageable, the algorithm discards all but its top hypotheses at each step.
In hopes of arriving at a more efficient algorithm, the MIT researchers adopted a different approach.
Their algorithm doesn't discard any of the hypotheses it generates across successive images, but it doesn't attempt to canvass them all either.
Suppose that the algorithm has identified three objects from one perspective and four from another. The most mathematically precise way to compare hypotheses would be to consider every possible set of matches between the two groups of objects:
Instead, the researchers' algorithm considers each object in the first group separately and evaluates its likelihood of mapping onto an object in the second group.
The algorithm could conclude that the most likely match for object 3 in the second group is object 3 in the first
So the researchers' algorithm also looks for such double mappings and reevaluates them. That takes extra time
In this case, the algorithm would perform 32 comparisons--more than 20, but significantly fewer than 304.
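A toy sketch of that two-stage idea, with made-up match likelihoods (the real system derives them from the detector's appearance and position estimates):

```python
import itertools

# Made-up likelihoods that first-view object a is second-view object b.
L = {
    ("a1", "b1"): 0.90, ("a1", "b2"): 0.20, ("a1", "b3"): 0.10, ("a1", "b4"): 0.10,
    ("a2", "b1"): 0.85, ("a2", "b2"): 0.80, ("a2", "b3"): 0.20, ("a2", "b4"): 0.10,
    ("a3", "b1"): 0.10, ("a3", "b2"): 0.20, ("a3", "b3"): 0.60, ("a3", "b4"): 0.50,
}
FIRST, SECOND, NONE = ["a1", "a2", "a3"], ["b1", "b2", "b3", "b4"], "none"

def score(a, b):
    return 0.3 if b == NONE else L[(a, b)]  # 0.3: toy "no match" score

# Stage 1: each first-view object independently picks its best candidate
# (one cheap pass per object instead of scoring every possible match set).
best = {a: max(SECOND + [NONE], key=lambda b: score(a, b)) for a in FIRST}

# Stage 2: find double mappings -- two objects claiming the same match --
# and jointly re-evaluate only those objects.
claims = {}
for a, b in best.items():
    if b != NONE:
        claims.setdefault(b, []).append(a)
for group in (g for g in claims.values() if len(g) > 1):
    combos = itertools.product(SECOND + [NONE], repeat=len(group))
    valid = (c for c in combos
             if len([b for b in c if b != NONE]) ==
                len({b for b in c if b != NONE}))
    assignment = max(valid,
                     key=lambda c: sum(score(a, b) for a, b in zip(group, c)))
    best.update(zip(group, assignment))

print(best)  # a1 -> b1, a2 -> b2 once the conflict over b1 is resolved
```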
and then bring them together," explains Faraz Najafi, a graduate student in electrical engineering and computer science at MIT and first author on the new paper.
A computational element made from such a particle--known as a quantum bit or qubit--could thus represent zero and one simultaneously.
If multiple qubits are entangled, meaning that their quantum states depend on each other, then a single quantum computation is in some sense like performing many computations in parallel.
With most particles entanglement is difficult to maintain, but it's relatively easy with photons.
For that reason, optical systems are a promising approach to quantum computation. But any quantum computer--say, one whose qubits are laser-trapped ions
or nitrogen atoms embedded in diamond--would still benefit from using entangled photons to move quantum information around.
or hundreds of photonic qubits, it becomes unwieldy to do this using traditional optical components," says Dirk Englund, the Jamieson Career Development Assistant Professor in Electrical Engineering and Computer Science at MIT and corresponding author on the new paper.
which is led by Karl Berggren, an associate professor of electrical engineering and computer science, and of which Najafi is a member.
and an affiliate of the Beckman Institute for Advanced Science and Technology at Illinois. He also holds affiliate appointments in the departments of bioengineering, chemistry, electrical and computer engineering,
and algorithm programming. "I don't think there are many places in the world where one finds the level of interdisciplinary cooperation that exists in our Center for Neuroprosthetics."
#New algorithm will allow better heart surgery, experts say
A new technique to help surgeons find the exact location of heart defects could save lives,
Now the team at Manchester has come up with a new algorithm which will enable medics to find the exact area of concern before any surgery takes place.
the algorithm will detect the origin of the heart defect, cutting the amount of time in surgery for some patients.
Professor Henggui Zhang describes how the new algorithm had a success rate of 94%. Using 3D computer modelling of the human heart,
Using this new algorithm, an ECG map can help diagnose the location of a cardiac disorder in a way that is better for patients and more cost-effective for health services
#Computing: Common 'data structure' revamped to work with multicore chips
Today hardware manufacturers are making computer chips faster by giving them more cores, or processing units.
But while some data structures are well adapted to multicore computing, others are not. In principle, doubling the number of cores should double the efficiency of a computation.
With algorithms that use a common data structure called a priority queue, that's been true for up to about eight cores--but adding any more cores actually causes performance to plummet. At the Association for Computing Machinery's Symposium on Principles and Practice of Parallel Programming in February, researchers from MIT's Computer Science and Artificial Intelligence Laboratory will describe a new way of implementing priority queues that lets them keep pace with the addition of new cores.
In simulations, algorithms using their data structure continued to demonstrate performance improvement with the addition of new cores, up to a total of 80 cores.
A priority queue is a data structure that, as its name might suggest, sequences data items according to the priorities assigned to them.
Priority queues are central to the standard algorithms for finding the shortest path across a network
and computer science and one of the new paper's co-authors. "All of these guys try to put the first element in their cache
their advisor, professor of computer science and engineering Nir Shavit; and Microsoft Research's Dan Alistarh, a former student of Shavit's, relaxed the requirement that each core has to access the first item in the queue.
--which must be the case for multicore computing to work anyway--they can simply be assigned to cores at random.
But the MIT researchers' algorithm starts farther down the hierarchy; how far down depends on how many cores are trying to access the root list.
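A minimal single-threaded sketch of the relaxation itself, not the authors' actual data structure: each removal picks at random among the first few items instead of contending for the single head, which is acceptable whenever items near the front can be processed in any order:

```python
import heapq
import random

class RelaxedPriorityQueue:
    """Toy relaxed priority queue: pop() returns one of the k smallest
    items, chosen at random. On real multicore hardware this spreads
    threads across the front of the queue instead of having every core
    fight over the head; here it only illustrates the semantics."""

    def __init__(self, k=8):
        self.k = k
        self.heap = []

    def push(self, item):
        heapq.heappush(self.heap, item)

    def pop(self):
        window = heapq.nsmallest(self.k, self.heap)
        item = random.choice(window)
        self.heap.remove(item)   # O(n) here; fine for a sketch
        heapq.heapify(self.heap)
        return item

q = RelaxedPriorityQueue(k=4)
for x in random.sample(range(100), 20):
    q.push(x)
print([q.pop() for _ in range(5)])  # roughly, not strictly, ascending
```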
One such piece of evidence comes from an observation surrounding fetal programming, says Rosen. "Fetal programming centers on a person's exposure in utero,
"he explains.""So, for example, whether a fetus has received too few or too many nutrients from the mother can lead to a person becoming obese or diabetic in adulthood,
When the researchers treated live bacteria with the new drug, two of the genetic changes actually arose just as their algorithm predicted.
"This gives us a window into the future to see what bacteria will do to evade drugs that we design, before a drug is deployed," said co-author Bruce Donald, a professor of computer science and biochemistry at Duke.
and Amy Anderson at the University of Connecticut used a protein design algorithm they developed called OSPREY to identify DNA sequence changes in the bacteria that would enable the resulting protein to block the drug from binding
The researchers are now using their algorithm to predict resistance mutations to other drugs designed to combat pathogens like E coli and Enterococcus.
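OSPREY itself performs physics-based protein design; as a rough illustration of the resistance-scanning idea only, here is a toy scan in which stand-in functions replace its energy calculations (the sequence, positions and rules are all made up):

```python
WILD_TYPE = "MKTAYIAK"           # made-up protein sequence
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def keeps_function(seq):
    """Stand-in for 'variant still performs its native function'."""
    return seq[2] == "T" and seq[5] == "I"

def drug_still_binds(seq):
    """Stand-in for 'the designed drug still binds this variant'."""
    return seq[4] == "Y"

# Scan all single-point mutants for resistance candidates: variants
# predicted to keep working while escaping the drug.
resistant = []
for pos in range(len(WILD_TYPE)):
    for aa in AMINO_ACIDS:
        if aa == WILD_TYPE[pos]:
            continue
        mutant = WILD_TYPE[:pos] + aa + WILD_TYPE[pos + 1:]
        if keeps_function(mutant) and not drug_still_binds(mutant):
            resistant.append(mutant)

print(len(resistant), "candidate resistance mutations")  # 19 in this toy
```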
Some day such buffers could be incorporated in quantum computers. While it is already known that slow and fast light can be obtained using Brillouin scattering, our device is far smaller
or computer code to make things, living cells from skin, muscle or cartilage are the raw material.
This reduces the need for trial and error experimentation in the lab. "Using a supercomputer at Argonne National Laboratory, we are able to use our computer simulations to compress decades of research in the lab into a total of about a day's worth of computing," said lead researcher Ilja
Predicting the zeolites' performance required serious computing power, efficient computer algorithms and accurate descriptions of the molecular interactions.
The team's software can utilize Mira, a supercomputer with nearly 800,000 processors, to run in a day the equivalent computations requiring about 10 million hours on a single-processor computer.
The computations identified zeolites to attack two complex problems. The first problem researchers tackled is the current multi-step ethanol purification process encountered in biofuel production.