Education and Outreach Blog

HPC Focus on Research for the Week of December 24, 2012, Sponsored by XSEDE

Computational Research Features from Across the Country and Around the World – Highlights of 2012 from XSEDE Partners and Collaborators

Climate Science Triggers Torrent of Big Data Challenges for ORNL Researchers

Oak Ridge National Laboratory (ORNL) supercomputers running models to assess climate change ramifications and mitigation tactics are rapidly generating a wide variety of big data in vast volumes. ORNL's Galen Shipman says climate researchers have significantly boosted the temporal and spatial resolution of climate models as well as their physical and biogeochemical complexity, contributing to the amount of data produced by the models. To read further, please visit http://www.hpcwire.com/hpcwire/2012-08-15/climate_science_triggers_torrent_of_big_data_challenges.html.

Harvard Medical School Researchers Write Book Using DNA

Harvard Medical School researchers have encoded an entire book in DNA. The book includes more than 50,000 words, 11 images, and one computer program totaling about 0.7 megabytes of data. The researchers say DNA has unique advantages for data storage, such as improved data density and durability. They also note that DNA can survive for millennia undamaged, and the tools and technologies required for reading out the information will be available in future generations. The researchers divided the information in the book into pieces, and then synthesized each of the pieces into short DNA fragments of about 160 nucleotides. Each fragment carries part of the book, information about its position, as well as the parts necessary for reading and replicating the piece. To read further, please visit http://www.washingtonpost.com/national/health-science/researchers-write-book-using-dna/2012/08/19/9a84903c-e95b-11e1-a3d2-2a05679928ef_story.html.
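
The article does not give the exact encoding scheme, but the general idea can be sketched in a few lines: slice the data into short payloads, attach an address to each, and map bits onto the four bases. In the sketch below, the chunk size, address width, and two-bit base mapping are illustrative assumptions rather than the Harvard team's actual design, which also includes the sequences needed for reading and replicating each fragment.

```python
# Illustrative sketch only (not the Harvard encoding): split a text into short,
# addressed payloads and map each bit pair onto one of the four DNA bases.

BASES = "ACGT"  # assumed mapping: A=00, C=01, G=10, T=11

def bits_to_dna(bits):
    """Convert a bit string whose length is divisible by 2 into a DNA string."""
    return "".join(BASES[int(bits[i:i + 2], 2)] for i in range(0, len(bits), 2))

def encode(text, payload_bytes=16, index_bits=16):
    """Break text into addressed DNA fragments short enough to synthesize."""
    data = text.encode("utf-8")
    fragments = []
    for idx in range(0, len(data), payload_bytes):
        chunk = data[idx:idx + payload_bytes]
        address = format(idx // payload_bytes, f"0{index_bits}b")  # fragment position
        payload = "".join(format(b, "08b") for b in chunk)
        fragments.append(bits_to_dna(address + payload))
    return fragments

if __name__ == "__main__":
    for fragment in encode("It was the best of times, it was the worst of times."):
        print(len(fragment), fragment)
```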

The Big Apple's Big Data Advantage

Microsoft's new research lab in Manhattan will focus on big data analysis, examining massive amounts of information created by the world's digital users, says lab director Jennifer Chayes. She says the facility will study how big data can help answer social science and economic questions, and what it means for the interaction of the social sciences with technology. One project studies how people make bets: when people bet on something, they are usually more invested in it, which makes betting a very effective way of collecting data, Chayes notes. The lab also has researchers who are building Vowpal Wabbit, a machine-learning platform that provides a faster way to analyze huge data sets. Chayes says the lab has strong relationships with all of the major universities in the area, such as New York University, Columbia University, and Cornell University. New York City has adopted the nickname "Silicon Alley," and it is becoming a focal point for data-intensive startups in Web 2.0 and beyond, Chayes notes. To read further, please visit http://tech.fortune.cnn.com/2012/08/20/the-big-apples-big-data-advantage/.

As Smart Electric Grid Evolves, Virginia Tech Engineers Show How to Include Solar Technologies

An optimization algorithm could help ensure that solar technologies are integrated with existing technologies such as energy storage and control systems. Virginia Tech electrical engineers have developed an optimization algorithm for selling power back to the electrical distribution industry and storing electricity on a broad scale. "Withholding distributed photovoltaic power, probably gained from rooftop panels, represents a gaming method to realize higher revenues due to the time varying cost of electricity," says Virginia Tech's Reza Arghandeh. "The distributed photovoltaic power adoption can be controlled with the help of real-time electricity price and load profile." Arghandeh worked with professor Robert Broadwater on the distributed energy storage system computation. The discrete ascent optimal programming approach ensures convergence of the various power systems after a finite number of computational iterations. To read further, please visit http://eng.vt.edu/news/smart-electric-grid-evolves-virginia-tech-engineers-show-how-include-solar-technologies.
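
The article does not detail the algorithm itself, so the following is only a generic sketch of a discrete-ascent search of the kind described: each iteration makes the single most profitable charge/discharge move under a time-varying price, and the search halts after finitely many iterations because every accepted move strictly increases revenue and the move space is finite. The prices, storage capacity, round-trip efficiency, and hourly limits are made-up values, not Virginia Tech's model.

```python
# Generic discrete-ascent sketch (not the Virginia Tech implementation).
# Illustrative parameters: prices in $/MWh, capacity and hourly limits in
# whole energy units, a 90 percent round-trip efficiency.

def discrete_ascent_dispatch(prices, capacity=3, efficiency=0.9, hourly_limit=1):
    """Greedily pair cheap charging hours with later, expensive discharging hours."""
    charge = [0] * len(prices)      # units bought and stored in each hour
    discharge = [0] * len(prices)   # stored units sold back in each hour
    cycled = 0
    while cycled < capacity:
        best = None
        for b in range(len(prices)):             # candidate charging hour
            for s in range(b + 1, len(prices)):  # discharge must come later
                if charge[b] >= hourly_limit or discharge[s] >= hourly_limit:
                    continue
                gain = efficiency * prices[s] - prices[b]
                if gain > 0 and (best is None or gain > best[0]):
                    best = (gain, b, s)
        if best is None:
            break                                # no improving move remains
        _, b, s = best
        charge[b] += 1
        discharge[s] += 1
        cycled += 1
    return charge, discharge

if __name__ == "__main__":
    hourly_prices = [30, 25, 22, 28, 45, 60, 55, 40]
    print(discrete_ascent_dispatch(hourly_prices))
```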

NSF is Building an Army of You

National Science Foundation researchers have developed a smart, animated, digital double that can interact with other people via a screen when the user is not present. These autonomous identities are not duplicates of human beings, but rather simple and potentially useful personas that could take on difficult tasks, and perhaps even modify people's behavior. The digital double is one of several new autonomous avatar technologies that are currently being developed. For example, the Web site rep.licants.org enables users to create a social media self, which can take over Facebook and Twitter accounts when required. Meanwhile, MyCyberTwin enables users to create copies of themselves that can engage visitors in a text conversation, accompanied by a photo or cartoon representation. Northeastern University researchers are developing animated avatars of doctors and other health-care providers, because tests show that 70 percent of patients prefer talking to a virtual version of a nurse instead of a real one. To read further, please visit http://www.newscientist.com/article/mg21528771.200-digital-doppelgangers-building-an-army-of-you.html?full=true.

Toward an R&D Roadmap for Privacy

The Information Technology & Innovation Foundation recently released a report calling for a research and development (R&D) roadmap for privacy, as well as a companion Web site to enable researchers to collaborate on creating a privacy research agenda. "Effectively addressing privacy concerns ... will require a mix of new technologies and policies to ensure data is properly safeguarded and consumers are protected," and a roadmap will "help address consumer privacy concerns, better align R&D investments with strategic objectives, and enable more innovation," the report says. The report warns that "if privacy concerns are not adequately addressed, they may stall or disrupt the deployment of new technologies that offer many potential economic and quality-of-life benefits to consumers." In addition, the report notes that "advances in privacy research and technology could strengthen consumer trust and better protect consumer privacy while enabling continued innovation." To read further, please visit http://www.cccblog.org/2012/08/06/toward-a-rd-roadmap-for-privacy/.

Data Supercell at Pittsburgh Supercomputing Center

The Pittsburgh Supercomputing Center (PSC) has developed and deployed a cost-effective, disk-based file repository and data-management system called the Data Supercell. This innovative technology, developed by a PSC team of scientists, provides major advantages over traditional tape-based archiving for large-scale datasets. The PSC team exploited the increasing cost-effectiveness of commodity disk technologies and adapted sophisticated PSC-developed file system software (called SLASH2) to create a new class of integrated storage services. A patent application is under review. The Data Supercell is intended especially to serve users of large scientific datasets, including users of XSEDE (the Extreme Science and Engineering Discovery Environment), the National Science Foundation cyberinfrastructure program that is the world's largest collection of integrated digital resources and services. “The Data Supercell is a unique technology, building on the increasing cost-effectiveness of disk storage and the capabilities of PSC’s SLASH2 file system,” said Michael Levine and Ralph Roskies, PSC co-scientific directors. To read further, please visit http://www.psc.edu/index.php/newscenter/71-2012press/728-data-supercell-at-pittsburgh-supercomputing-center.

UC San Diego Team Aims to Broaden Researcher Access to Protein Simulation

University of California, San Diego researchers have developed graphics-processing unit (GPU)-accelerated software and demonstrated an approach that can sample biological events that occur on the millisecond timescale. The researchers combined an enhanced sampling algorithm, an off-the-shelf GPU, and the Assisted Model Building with Energy Refinement (AMBER) software to sample biological events that previously required specialized hardware such as the Anton supercomputer. "This work shows that using conventional, off-the-shelf GPU hardware combined with an enhanced sampling algorithm, events taking place on the millisecond time scale can be effectively sampled with dynamics simulations orders of magnitude shorter than those timescales," the researchers say. The enhanced sampling algorithm refers to accelerated molecular dynamics (aMD), which improves the conformational space sampling of proteins compared with conventional molecular dynamics (cMD) simulations. To read further, please visit http://ucsdnews.ucsd.edu/pressreleases/uc_san_diego_team_aims_to_broaden_researcher_access_to_protein_simulation#.UCUC36PLvKs.
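
The press release does not spell out the method, but in the commonly used aMD formulation a "boost" energy is added whenever the system's potential energy falls below a chosen threshold, which shallows deep energy wells and lets the simulation hop between conformations far more often than cMD does. The sketch below shows that boost function together with the standard exponential reweighting used to recover unbiased statistics; the threshold E, smoothing parameter alpha, and temperature are placeholder values that would be tuned for each system.

```python
from math import exp

def amd_boost(V, E=100.0, alpha=20.0):
    """Boost energy added at instantaneous potential V in the common aMD form.

    Below the threshold E the landscape is raised by (E - V)^2 / (alpha + E - V);
    above E the dynamics are left unmodified. Units follow V (e.g., kcal/mol).
    """
    if V >= E:
        return 0.0
    return (E - V) ** 2 / (alpha + E - V)

def amd_reweight_factor(V, kT=0.596, **params):
    """exp(dV / kT) weight used to recover canonical averages from aMD frames.

    kT is in kcal/mol at roughly 300 K; pass E and alpha through **params.
    """
    return exp(amd_boost(V, **params) / kT)

if __name__ == "__main__":
    # Placeholder numbers: a frame deep in an energy well gets a large boost.
    print(amd_boost(40.0), amd_reweight_factor(40.0))
```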

NIST’s BIG DATA Workshop: Too Much Data, Not Enough Solutions

Argonne National Laboratory Computation Institute director Ian Foster gave a keynote speech at the U.S. National Institute of Standards and Technology's recent BIG DATA Workshop, an event that brought together experts from academia, industry, and government to study key topics in support of the federal government's Big Data R&D Initiative. Foster says researchers and institutions can meet the needs of big data, and accelerate discovery and innovation worldwide, by delivering research information technology (IT) as a service. In addition, he says IT professionals can leverage the cloud to provide millions of researchers with access to powerful tools, enable a massive shortening of cycle times in the research process, and reduce researchers' IT needs. Meanwhile, the U.S. National Science Foundation's (NSF's) Howard Wactlar says a paradigm shift is currently taking place, from hypothesis-driven research to data-driven research. To read further, please visit http://www.cccblog.org/2012/06/21/nists-big-data-workshoptoo-much-data-not-enough-solutions/.

$27 Million Award Bolsters Open Science Grid

The U.S. Department of Energy's (DOE's) Office of Science and the U.S. National Science Foundation (NSF) recently committed up to $27 million to the Open Science Grid (OSG), a nine-member partnership to advance distributed high-throughput computing capabilities at more than 80 sites. "The commitment from the two agencies will take the capabilities and culture we've developed to more campuses throughout the United States," says OSG researcher and University of Wisconsin-Madison professor Miron Livny. "It is about advancing the state of the art to support education and research in more science domains and improve our ability to handle more data." The Office of Science will contribute up to $8.2 million for distributed computing efforts based at DOE national laboratories. The NSF will contribute the remaining balance of the funding, which will be used to promote distributed computing resources at U.S. universities. To read further, please visit http://www.news.wisc.edu/20800.

Lawrence Berkeley National Laboratory, UCSD and Los Alamos National Laboratory Researchers Design Strategies for Extracting Interesting Data from Massive Scientific Datasets

Researchers at Lawrence Berkeley National Laboratory, the University of California, San Diego (UCSD), Los Alamos National Laboratory, Tsinghua University, and Brown University have developed software strategies for storing, mining, and analyzing huge datasets, focusing on data generated by VPIC, a state-of-the-art plasma physics code. When the researchers ran VPIC on the U.S. Department of Energy’s National Energy Research Scientific Computing Center’s supercomputer, they generated a three-dimensional (3D) magnetic reconnection dataset of one trillion particles. VPIC simulated the process in thousands of time-steps, periodically writing a 32-terabyte file to disk. The researchers applied an enhanced version of the FastQuery tool to index the massive dataset in about 10 minutes. "This is the first time anyone has ever queried and visualized 3D particle datasets of this size," says UCSD researcher Homa Karimabadi. To read further, please visit http://crd.lbl.gov/news-and-publications/news/2012/sifting-through-a-trillion-electrons/.
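
FastQuery's own interface is not shown in the article, but the kind of question it answers, such as finding every particle whose energy exceeds a threshold in a multi-terabyte HDF5 file, can be illustrated with a plain chunked scan. FastQuery reaches the same answer far faster by consulting precomputed bitmap indexes instead of reading every value; the file name, dataset path, threshold, and chunk size below are hypothetical.

```python
# Illustration only: the kind of range query FastQuery accelerates with bitmap
# indexes, written here as a plain chunked NumPy scan over an HDF5 particle file.

import h5py
import numpy as np

def select_energetic_particles(path="vpic_particles.h5", dataset="Step#40/energy",
                               threshold=1.3, chunk=10_000_000):
    """Return indices of particles whose energy exceeds `threshold`."""
    hits = []
    with h5py.File(path, "r") as f:
        energy = f[dataset]
        for start in range(0, energy.shape[0], chunk):
            block = energy[start:start + chunk]      # read one slab into memory
            hits.append(start + np.nonzero(block > threshold)[0])
    return np.concatenate(hits) if hits else np.array([], dtype=np.int64)
```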

TACC Is Mapping the Future of Climate Change in Africa

Researchers at the Texas Advanced Computing Center (TACC) are working on the Climate Change and African Political Stability (CCAPS) program, which features an online mapping tool that analyzes how climate and other factors interact to threaten the security of African communities. "The first goal was to look at whether we could more effectively identify what were the causes and locations of vulnerability in Africa, not just climate, but other kinds of vulnerability," says University of Texas at Austin professor Francis J. Gavin. CCAPS consists of nine research teams focusing on different aspects of climate change, their relationship to different types of conflict, the government structures that exist to mitigate them, and the effectiveness of international aid in intervening. The researchers, led by University of Texas at Austin professor Joshua Busby, examined four different sources and then combined them to form a composite map.  To read further, please visit http://www.tacc.utexas.edu/news/feature-stories/2012/ccaps.
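
The article does not describe the weighting scheme, but the mapping tool's underlying idea, normalizing several gridded indicators and blending them into one composite surface, can be sketched generically. The equal weights, min-max normalization, and random stand-in grids below are illustrative assumptions rather than the CCAPS methodology.

```python
# Generic composite-index sketch: rescale each indicator grid to [0, 1] and
# take a weighted sum, yielding one vulnerability surface per grid cell.

import numpy as np

def normalize(layer):
    """Rescale a 2-D indicator grid to the range [0, 1]."""
    lo, hi = np.nanmin(layer), np.nanmax(layer)
    return (layer - lo) / (hi - lo)

def composite_vulnerability(layers, weights=None):
    """Combine normalized indicator grids into one weighted composite grid."""
    weights = weights or [1.0 / len(layers)] * len(layers)
    stacked = np.stack([w * normalize(l) for w, l in zip(weights, layers)])
    return stacked.sum(axis=0)

if __name__ == "__main__":
    # Four random stand-in grids, matching the article's four combined sources.
    rng = np.random.default_rng(0)
    grids = [rng.random((180, 360)) for _ in range(4)]
    print(composite_vulnerability(grids).shape)
```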

Campus Champion Michigan State University Relieves Logjams in Big Data Sets

Michigan State University (MSU) researchers have developed a computational technique that relieves logjams that commonly occur in big data sets. The researchers note that microbial communities' genomic data is easy to collect, but the data sets are so large that they can overwhelm conventional computers. "To thoroughly examine a gram of soil, we need to generate about 50 terabases of genomic sequence--about 1,000 times more data than generated for the initial human genome project," says MSU professor C. Titus Brown. He notes the strategy is unique in that it was created using small computers rather than supercomputers, which is the usual approach for bioinformatics research. The method utilizes a filter that folds the data set up using a special data structure, which enables computers to analyze small portions of the data at a time. To read further, please visit http://news.msu.edu/story/massive-data-for-miniscule-communities/
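
The article does not name the data structure, so the sketch below shows one compact, probabilistic filter of the general kind it alludes to: every k-mer (length-k substring) of a sequencing read is hashed into a fixed-size bit array, so membership queries cost a few bits per element no matter how large the data set grows, at the price of occasional false positives. The array size, number of hash functions, and k value are illustrative choices, not the MSU group's published parameters.

```python
# Bloom-filter-style k-mer membership structure (illustrative parameters).

import hashlib

class KmerBloomFilter:
    def __init__(self, size_bits=8_000_000, num_hashes=4, k=20):
        self.bits = bytearray(size_bits // 8)
        self.size_bits = size_bits
        self.num_hashes = num_hashes
        self.k = k

    def _positions(self, kmer):
        for i in range(self.num_hashes):
            digest = hashlib.sha1(f"{i}:{kmer}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.size_bits

    def add_read(self, read):
        """Hash every k-mer of a sequencing read into the bit array."""
        for j in range(len(read) - self.k + 1):
            for pos in self._positions(read[j:j + self.k]):
                self.bits[pos // 8] |= 1 << (pos % 8)

    def seen(self, kmer):
        """True if the k-mer was (probably) seen before; false positives possible."""
        return all(self.bits[p // 8] & (1 << (p % 8)) for p in self._positions(kmer))
```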

Argonne National Laboratory Replaces Supercomputer With Newer, Faster Model

Argonne National Laboratory recently started accepting applications from scientists who want to use its new Mira supercomputer, which is ranked the third fastest in the world, has 768,000 processor cores, and operates at more than eight petaflops. Mira's initial applications include studying the quantum mechanics of new materials, measuring the role and impact of clouds on climate, and modeling earthquakes. Those and 13 other projects are part of Argonne's Early Science Program and are intended to advance science, as well as evaluate Mira's performance, according to Argonne's Mike Papka. "A new architecture with a new system software stack, and at a scale that is larger than anyone else has run previously, results in a system that will have issues never seen before," Papka says. To read further, please visit http://www.informationweek.com/government/enterprise-applications/national-lab-replaces-supercomputer-with/240004607.

European Union Offers Big Data at Your Service

The growing use of information and communication technology is generating vast volumes of structured and unstructured data that present an opportunity that European Union (EU) research initiatives are seeking to take advantage of by promoting open data. For example, the EU-funded Weknowit project, also called Emerging, Collective Intelligence for Personal, Organizational, and Social Use, has devised a platform for converting unstructured user-produced content into a new collective intelligence with many uses. Project coordinator Yiannis Kompatsiaris says the platform includes "meaningful topics, entities, points of interest, social connections, and events." Projects that could benefit scientific research include the Data Infrastructures Ecosystem for Science effort, which has developed an interoperable framework to enable the sharing of different e-infrastructures' computing and software resources irrespective of location, format, technology, language, protocol, or workflow. To read further, please visit http://ec.europa.eu/information_society/newsroom/cf/dae/itemdetail.cfm?item_id=8337&utm_campaign=isp&utm_medium=rss&utm_source=newsroom&utm_content=type-news.

UCSD Seeks Citizen Scientists for Earth Shaking Science Project

Members of the public are being asked to help scientists capture key seismic data to improve scientific understanding of earthquakes, provide detailed information on how they shape Southern California, and aid earthquake emergency response efforts. This call for help comes from members of the “Quake Catcher Network,” a collaborative project sponsored by the National Science Foundation in which earthquake scientists around Southern California enlist volunteers to deploy small, easy-to-install seismic sensors in their homes, offices and other locations that have a computer with Internet connectivity. The project is conducted by scientists at Scripps Institution of Oceanography at UC San Diego, California Institute of Technology, Stanford University, UC Berkeley, University of Delaware and the U.S. Geological Survey (USGS). For more information, please visit http://ucsdnews.ucsd.edu/features/citizen_scientists_sought_for_earth_shaking_science_project/.

University of California, Berkeley Hosts Forum on the Big Data Industry

Although "big data" has become a multibillion-dollar industry in less than 10 years, a lot of growth is still needed before the industry has proven standards. The big data industry also needs broad-based literacy, new kinds of management, better tools for reading the information, and privacy safeguards for corporate and personal information. Training people in how to take advantage of big data is another significant challenge. The University of California, Berkeley's iSchool recently hosted a forum on the big data industry and how these hurdles will be overcome in the future. For example, Cloudera says it is currently training 1,500 users a month on how to use the Hadoop data-processing framework and associated applications. The wide variety of new sources of data has made data quality an issue, and the problem is exacerbated by the fact that companies are reluctant to make their data available in a commonly shared format. To read further, please visit http://bits.blogs.nytimes.com/2012/06/04/how-big-data-gets-real/.

Coding Contest Shows How Big Data Can Improve Health Care

The recent Health 2.0 Boston Code-a-thon brought together information technology (IT) professionals, medical workers, and other experts with an interest in health IT to show how data analytics can improve health care. The competition featured about 85 participants who formed groups to create an application that turns health care data into useful information for patients and care providers. The winning team created No Sleep Kills, a Web site that enables users to access information on how poor sleeping patterns can lead to car accidents. "The whole goal of getting more health data digital is so you can start doing meaningful things with data," says Guy Shechter, who helped develop No Sleep Kills. The site takes information from several sources, including publicly available data from the U.S. government. Event coordinator Deb Linton says the team won first place because it adhered to the competition's theme of using big data by incorporating multiple data sources. Health 2.0 co-chairman Matthew Holt notes that technology can only reach patients and caregivers if the tech community works within the health care system. To read further, please visit http://www.computerworld.com/s/article/9227493/Coding_contest_shows_how_big_data_can_improve_health_care.

Massachusetts Offers a New Model for Academic HPC

Several universities in Massachusetts will share high-performance computing (HPC) resources in a unique facility model by the end of the year. The Massachusetts Green High-Performance Computing Center (MGHPCC) will feature terascale hardware and the necessary infrastructure to enable its users to remotely access computing resources, including power, network, and cooling systems. University members include the Massachusetts Institute of Technology, Harvard University, Boston University, Northeastern University, and the University of Massachusetts system, and together they are developing new implementation strategies for the shared facility. The universities will provide their own hardware and migrate research to the $95 million center. MGHPCC executive director John Goodhue says the challenge will be to make the physical hardware act as a set of private local machines for the various users. To read further, please visit http://www.hpcwire.com/hpcwire/2012-05-29/massachusetts_offers_a_new_model_for_academic_hpc.html.

Exascale: Raising the Stakes
Excerpt from HPC Projects

Following the progress on the road to exascale is an interesting and entertaining exercise as more players get involved, predictions get more aggressive and politics and national pride take charge. And of course, the hype meter is overworked as more and more companies claim to have the answers. Funds committed to reaching exascale seem to be growing by the week. But let’s not forget – funding commitments don’t always result in checks being written. It is still far too early to determine who will demonstrate the financial staying power necessary to bring the first exascale-class system to market. The cultures and government infrastructures of China and Japan represent incredibly powerful forces capable of aligning resources, financial and other, to hold a steady course over the next eight years (approximate time for the arrival of exascale). Neither the spirit of commitment attached to national pride, nor the capability of technological innovation from these countries should be taken lightly. They are indeed the front runners for anyone betting on this race. As far back as June of 2010, we saw this quote in The Exascale Report from an anonymous contributor in China labeled Mr Zheng: ‘I think it would be great if the first exascale computer had a very large engraved tag that said, “Made in China”.’ To read further, please visit http://www.hpcprojects.com/news/news_story.php?news_id=1674&goback=.gde_4178444_member_108471597.

University of Pennsylvania Researchers Tune In to the Internet Buzz

University of Pennsylvania researchers recently launched the Mining Internet Messages for Evidence of Herbal-Associated Adverse Events (MICE) project, which involves mining message boards and Twitter feeds to see what breast and prostate cancer patients are saying about herbal and nutritional supplements as treatments. Even if there is no scientific evidence to support what people post, it is useful to identify areas that would merit further study, says University of Pennsylvania researcher John Holmes. The Internet can be a great source for information epidemiology and "we can learn a lot about public sentiment, public attitudes, and public knowledge," agrees University of Toronto professor Gunther Eysenbach. However, analyzing Web conversations raises some ethical and privacy issues. MICE researchers only mine discussion sites that require participants to register and explicitly state in their terms of use that any information posted will become public. To read further, please visit http://online.wsj.com/article/SB10001424052702303404704577309794125038010.html.
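
The project's actual pipeline is not described in the article; as a flavor of the task, the sketch below runs a crude lexicon match over posts and tallies supplement and symptom terms that co-occur, a starting point a real system would refine with de-identification, negation handling, and clinical review. The word lists and the co-occurrence rule are illustrative assumptions.

```python
# Lexicon-based co-occurrence sketch (illustrative word lists, not MICE's methods).

import re
from collections import Counter

SUPPLEMENTS = ["green tea", "flaxseed", "turmeric", "soy", "vitamin d"]
SYMPTOMS = ["nausea", "fatigue", "rash", "hot flashes", "bleeding"]

def cooccurrences(posts):
    """Count (supplement, symptom) pairs that appear in the same post."""
    counts = Counter()
    for post in posts:
        text = post.lower()
        found_supps = [s for s in SUPPLEMENTS if re.search(rf"\b{re.escape(s)}\b", text)]
        found_symps = [s for s in SYMPTOMS if re.search(rf"\b{re.escape(s)}\b", text)]
        counts.update((supp, symp) for supp in found_supps for symp in found_symps)
    return counts

if __name__ == "__main__":
    sample = ["Started flaxseed last month and the hot flashes are finally easing.",
              "Green tea extract gave me terrible nausea, anyone else?"]
    print(cooccurrences(sample).most_common())
```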

Face Recognition Could Catch Bad Avatars According to University of Louisville Researchers

University of Louisville researchers are developing the field of artificial biometrics, known as artimetrics, to serve as a way to authenticate and identify non-biological agents such as avatars, physical robots, and chatbots. The researchers, led by Roman Yampolskiy, have developed facial recognition techniques specifically designed for avatars. "Not all avatars are human looking, and even with those that are humanoid there is a huge diversity of color," Yampolskiy says. Therefore, the software uses a large variety of colors to improve the recognition of avatars. The researchers also are studying how to match a human face to an avatar generated from that face. Combining the color-based technique with existing facial recognition software led to the best results, suggesting that it could be possible to track users between the physical and virtual worlds. To read further, please visit http://www.newscientist.com/article/mg21428596.100-face-recognition-could-catch-bad-avatars.html.  
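
The exact features are not given in the article, but a color-driven comparison of the kind described can be sketched by reducing each avatar face image to a coarse RGB histogram and scoring histogram overlap. The bin counts and the intersection measure below are illustrative assumptions, not the published artimetrics method.

```python
# Coarse color-histogram matching sketch for avatar face images.

import numpy as np

def color_histogram(image, bins=8):
    """image: HxWx3 uint8 array. Returns a normalized 3-D RGB histogram."""
    hist, _ = np.histogramdd(image.reshape(-1, 3),
                             bins=(bins, bins, bins),
                             range=((0, 256), (0, 256), (0, 256)))
    return hist / hist.sum()

def similarity(img_a, img_b, bins=8):
    """Histogram intersection in [0, 1]; higher means more similar coloring."""
    return np.minimum(color_histogram(img_a, bins),
                      color_histogram(img_b, bins)).sum()
```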

Lehigh University Researchers Move Toward a Modular Defense Against Hackers

Lehigh University professor Gang Tan has developed automated techniques to scan for errors in large software systems. Tan and Lehigh researchers also recently received a five-year CAREER Award from the U.S. National Science Foundation to study and develop modular software that is less vulnerable to system-wide attacks by hackers. The researchers want to apply the principle of least privilege to software systems. "The principle of least privilege is like the separation of powers in a political system," Tan notes. He says the researchers have made progress in privilege separation in software environments, but challenges remain with operating system portability, high runtime overhead, architectural flexibility, and compositional reasoning. "These new tools and methodologies will make the principle of least privilege easier to apply to big software systems," Tan says. To read further, please visit http://www4.lehigh.edu/news/newsarticle.aspx?Channel=/Channels/News:+2012&WorkflowItemID=519ce182-445e-4d8a-a837-6945492cdf08.
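
The Lehigh tools themselves are not shown, but the principle of least privilege they aim to enforce follows a familiar pattern, sketched minimally below: perform the one operation that genuinely needs elevated rights, then irreversibly drop to an unprivileged account before handling untrusted input. The port number and the "nobody" account are assumptions for the example, and the script must start as root on a Unix-like system for the drop to succeed.

```python
# Classic least-privilege pattern (generic illustration, not Lehigh's tools).

import os
import pwd
import socket

def bind_privileged_port(port=80):
    """Binding below port 1024 is the only step here that needs root."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.bind(("0.0.0.0", port))
    sock.listen(16)
    return sock

def drop_privileges(username="nobody"):
    """Irreversibly switch to an unprivileged uid/gid (process must start as root)."""
    entry = pwd.getpwnam(username)
    os.setgid(entry.pw_gid)   # drop group first, while we still may change it
    os.setuid(entry.pw_uid)   # after this, root-only operations will fail

if __name__ == "__main__":
    listener = bind_privileged_port()
    drop_privileges()
    # ...handle client connections here with no ability to damage the system...
```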

Research Affiliate with SDSC Urges “Big Data” Donations of Health Data

UC San Diego Biomedical Informatics chief Lucila Ohno-Machado, a researcher affiliated with Calit2 and SDSC, proposes that patients could donate data to researchers the way they donate tissue or organs, with informed consent. She was speaking at a Big Data conference in Washington, D.C., on a panel moderated by the New York Times' Steve Lohr. Ohno-Machado is the Founding Chief of UC San Diego’s Division of Biomedical Informatics, Director of the Biomedical Research Informatics for Global Health Program, and Editor-in-Chief of the Journal of the American Medical Informatics Association. She noted that patients today give a tremendous amount of data to health care providers and hospitals, data that could be of great use to researchers if assembled together, for example in solving the puzzle behind autism. If properly handled and analyzed, such data could bring great advances in knowledge without violating privacy, but, like the other panelists, she stressed the need for more multidisciplinary data scientists in her field. For more information on this panel discussion, please visit http://www.govloop.com/profiles/blogs/challenges-and-opportunities-in-big-data-from-industry-and-1.

NCSA's Forge to be Decommissioned in September 2012

NCSA's Dell/NVIDIA cluster, Forge, will be decommissioned due to lack of ongoing support. Support for Forge has come primarily from the National Science Foundation, but there are also allocations supported by the University of Illinois, NCSA's Private Sector Program, and the NCSA Director's Office. The batch queues on Forge will be set to drain by 5 pm Sept. 28. After this time, batch jobs that remain in the queues will not be executed. User access to the Forge login nodes will be available until Sept. 30, 2012 to allow for data retrieval. Data remaining in the scratch, projects, and home directories after Sept. 30, 2012 will be deleted. To read further, please visit http://www.ncsa.illinois.edu/News/12/0410NCSAForge.html.

PSC's Blacklight to Leverage Shared-Memory System for Novel and Innovative Projects Initiative

Times are changing for HPC (high-performance computing) research, as non-traditional fields of study have begun taking advantage of powerful HPC tools. This was part of the plan when the National Science Foundation’s XSEDE (Extreme Science and Engineering Discovery Environment) program launched in July 2011. In recent months, the program took big steps toward this objective, in that a number of non-traditional projects — the common denominator being the need to process and analyze large amounts of data — were awarded peer-reviewed allocations of time on XSEDE resources. To read further, please visit http://www.hpcwire.com/hpcwire/2012-03-30/xsede_allocating_time_to_hpc_projects_with_shared_memory.html.

UCSD Engineers Test Life Saving Technology in a Seismic Stress Test

What happens when you put a fully equipped five-story building, which includes two hospital floors, computer servers, fire barriers and even a working elevator, through a series of high-intensity earthquakes? Structural engineers at the University of California, San Diego began to get some answers last week, when they launched a series of tests conducted on the world’s largest outdoor shake table at the Englekirk Structural Engineering Center. The overarching goal of the $5 million project, which is supported by a coalition of government agencies, foundations and industry partners, is to ascertain what needs to be done to make sure that high-value buildings, such as hospitals and data centers, remain operational after going through an earthquake. Researchers also will assess whether the building’s fire barriers have been affected by the shakes. To read further, please visit http://ucsdnews.ucsd.edu/features/seismic_stress_test/

TACC Releases New Software to Drive Large-Scale Tiled Displays

The Texas Advanced Computing Center (TACC) at The University of Texas at Austin has released a new open-source software package called DisplayCluster that is used to drive large-scale tiled displays and allows scientists to interactively view high-resolution imagery and video up to gigapixels in size. Large-scale tiled displays are increasingly used by scientific communities for their effectiveness in visualizing immense data sets. Whether it is a microbiology professor teaching about viruses in the bloodstream, a doctor reviewing high-resolution medical scans, or an art student studying the brush strokes in Van Gogh's Starry Night—all of these domains use large, high-resolution displays to visualize information that may not be apparent on a smaller, lower-resolution screen, thereby improving comprehension. To read further, please visit http://www.tacc.utexas.edu/news/press-releases/2012/tacc-releases-new-software.

UCSD's Center for Design and Geopolitics: The Art & Theory of Planetary-Scale Computation

Take a look at a map depicting global computer networks and two things become immediately apparent: the vast number of connections between servers across the planet, and the way those connections overlap geopolitical boundaries. To read further, please visit http://www.calit2.net/newsroom/article.php?id=1948.

Colleges Looking Beyond the Lecture

Science, technology, engineering, and math departments at many universities are redesigning the lecture as a style of teaching out of concern that it is driving students away. Initiatives at American, Catholic, and George Washington universities and across the University System of Maryland are dividing 200-student lectures into 50-student studios and 20-student seminars. Faculty also are learning to make courses more active by asking more questions, starting ask-your-neighbor discussions, and conducting instant surveys. "We need to think about what happens when students have a bad experience with the course work," says University of Maryland, Baltimore County president Freeman Hrabowski. The lecture backlash signals an evolving vision of college as a participatory exercise, as research studies have shown that students in traditional lecture courses learn comparatively little. The anti-lecture movement is fueled by the proliferation of online lectures, which threaten the monopoly on learning by self-sufficient campuses. Other scholars are looking to improve, rather than replace, the lecture model. For example, Johns Hopkins chemistry professor Jane Greco records her lectures and posts them online as homework, and uses her time in the lecture hall for an interactive discussion of the lab experiment students completed the previous session. To read further, please visit http://www.washingtonpost.com/local/education/colleges-looking-beyond-the-lecture/2012/02/03/gIQA7iUaGR_story.html.

Oak Ridge National Laboratory Computer Scientists Collect Computing Tools for Next-Generation Machines

Oak Ridge National Laboratory's new Titan supercomputer, which is based on a hybrid architecture of central and graphics processing units and is expected to be operational next year, will replace the lab's Jaguar system, which uses an entirely central processing unit-based platform. "Anything that tool developers can do to reduce the burden of porting codes to new architectures, while ensuring performance and correctness, allows us to spend more time obtaining scientific results from simulations," says Oak Ridge researcher Bronson Messer. The Oak Ridge team is working to ensure that researchers will not have to spend huge amounts of time learning how to use their codes during the shift to hybrid computing architectures. Many of the tools that are used on Jaguar will be used on Titan, but they will have to be scaled up for the larger machine. For example, the researchers have expanded the capabilities of debugging software to meet the needs of large leadership-class systems. To read further, please visit http://www.ornl.gov/info/features/get_feature.cfm?FeatureNumber=f20120214-00.

UCSD Study of Origin of Lava Formations in Western U.S. Yields New Model, Aided By XSEDE Resources

Scientists at Scripps Institution of Oceanography at the University of California San Diego used XSEDE resources, including the Ranger and Lonestar supercomputers at the Texas Advanced Computing Center and Abe at the National Center for Supercomputing Applications, to create simulations that helped them discover a source of massive lava formations in the Western United States. To read further, please visit http://scrippsnews.ucsd.edu/Releases/?releaseID=1241.

TACC's Ranger Helps Predict Storm Intensity With Greater Accuracy

When Hurricane Irene swept through New England in August 2011, the National Hurricane Center (NHC) did an astounding job of predicting its path. However, Irene arrived significantly weaker than originally forecast, leading to a larger evacuation than would have occurred had NHC's intensity forecasts been closer to the mark. “The National Hurricane Center has been doing an excellent job over the past few decades of persistently increasing the hurricane forecast track accuracy,” said Fuqing Zhang, professor of meteorology at the Pennsylvania State University. “But there have been virtually no improvements in the intensity forecast.” Predicting how hurricanes form, intensify, or dissipate is different and more challenging than predicting their paths. To read further, please visit http://www.tacc.utexas.edu/news/feature-stories/2012/upgrading-the-hurricane-forecast.

Purdue University Researchers Develop New Imaging That Reveals Early Changes Leading to Breast Tumors

Purdue University researchers have created a new imaging technology that reveals subtle changes in breast tissue, representing a potential tool to determine a woman's risk of developing breast cancer and to study ways of preventing the disease. The researchers, using a special "3-D culture" that mimics living mammary gland tissue, also showed that a fatty acid found in some foods influences this early precancerous stage. Unlike conventional cell cultures, which are flat, the 3-D cultures have the round shape of milk-producing glands and behave like real tissue, said Sophie Lelièvre (pronounced Le-LEE-YEA-vre), an associate professor of basic medical science. To read further, please visit http://www.purdue.edu/newsroom/research/2012/120306LelievreBreastcancer.html.

Blue Waters “Early Science System” Delivered to NCSA

The first cabinets of the new Blue Waters sustained-petascale supercomputer have arrived at the University of Illinois' National Center for Supercomputing Applications and were powered up over the last few days. A total of 48 Cray XE6 cabinets were installed. These cabinets represent about 15 percent of the final Blue Waters system. They will be operated as a limited access "Early Science System" while the rest of the Blue Waters supercomputer is built over the next several months. In March, select scientific teams from around the country will begin using the Early Science System on research in a range of fields. In parallel, the Cray and NCSA team will use the Early Science System to prepare for the operation of the full Blue Waters supercomputer. The Early Science System will ultimately be integrated into that sustained-petaflop supercomputer, which NCSA expects to operate for five years. To read further, please visit http://www.ncsa.illinois.edu/News/Stories/BW_ESS/.

Berkeley Scientists Discover an "Instant Cosmic Classic" Supernova Using Supercomputing

A supernova discovered two weeks ago is closer to Earth - approximately 21 million light-years away - than any other of its kind in a generation. Astronomers believe they caught the supernova within hours of its explosion, a rare feat made possible with a specialized survey telescope and state-of-the-art computational tools. The finding of such a supernova so early and so close has energized the astronomical community, which is scrambling to observe it with as many telescopes as possible, including the Hubble Space Telescope. Joshua Bloom, assistant professor of astronomy at the University of California, Berkeley, called it "the supernova of a generation." Astronomers at Lawrence Berkeley National Laboratory and UC Berkeley, who made the discovery, predict that it will be a target for research for the next decade, making it one of the most-studied supernovae in history. To read further, please visit http://hpwren.ucsd.edu/news/20110825/.

Purdue University Researchers Focus On Humanoid Robots

Purdue University is involved in an international initiative to develop humanoid robots capable of responding to disasters. Purdue's focus is the creation of algorithms for the robot to climb an industrial ladder and traverse an industrial walkway, says Purdue professor C.S. George Lee. The algorithms are being used to program a HUBO II robot, and Lee says the project combines research in computer vision, locomotion and balance control, and machine learning. The effort is part of the U.S. Defense Advanced Research Projects Agency's Robotics Challenge. The researchers are using wireless technology to communicate with the robot. "There are numerous other potential applications for humanoid robots of the future, including space exploration, assisting the elderly, and working side-by-side with people in various environments," Lee notes. To read further, please visit http://www.purdue.edu/newsroom/releases/2012/Q4/humanoid-robots-are-focus-of-research-at-purdue.html.

Carnegie Mellon University, LANL and NSF Join Forces in Repurposing Supercomputers

Researchers at Los Alamos National Laboratory (LANL), the U.S. National Science Foundation, the New Mexico Consortium, and Carnegie Mellon University (CMU) recently launched the Parallel Reconfigurable Observational Environment (PRObE) program, a supercomputer research center built from a cluster of 2,048 computers recently retired from LANL supercomputing systems. "They decommission them every three or four years because the new computers make so much better results," says CMU professor Garth Gibson. PRObE partners successfully decommissioned and saved the computer clusters for reuse. Although the main facility will stay in Los Alamos, CMU's Parallel Data Lab in Pittsburgh will house two similar but smaller centers. The Pittsburgh facilities will enable researchers to perform small experiments and demonstrate to the PRObE committee that they are ready to request time on the facility in Los Alamos. PRObE's launch means that researchers will have the opportunity to experiment with supercomputers. "We are taking a resource, handing it to scientists and saying, 'Do your research on a dedicated facility,'" Gibson notes. To read further, please visit http://triblive.com/news/allegheny/2818020-74/computer-probe-research-alamos-facility-gibson-los-science-supercomputer-supercomputers#axzz2APhyFqK3

UC San Diego Researchers Launch Innovative, Hands-on Online Tool for Science Education

Computer scientists at the University of California, San Diego and at St. Petersburg Academic University in Russia, have developed a one-of-a-kind, hands-on online learning tool that weaves together for the first time science and programming education—and automatically grades homework too. “While modern biology is inundated with computation, biology students at U.S. universities are taught neither programming nor bioinformatics and as a result are unprepared for the challenges that await them in their own discipline,” said Pavel Pevzner, a computer science professor at the Jacobs School of Engineering at UC San Diego. “We provide a tool to fill that learning gap.” The new tool, called Rosalind (http://rosalind.info), diverges from large-scale, online open education platforms such as Coursera and Udacity. Instead of listening to a lecture, students are required to complete increasingly difficult problems at their own pace.  To read further, please visit http://www.jacobsschool.ucsd.edu/news/news_releases/release.sfe?id=1278.
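
As a flavor of the exercises Rosalind grades automatically, an introductory counting-nucleotides problem can be solved in a few lines of Python; the sample string below is made up.

```python
# A Rosalind-style warm-up exercise: count how many times each nucleotide
# appears in a DNA string. Submissions like this are what the site checks.

from collections import Counter

def nucleotide_counts(dna):
    """Return counts of A, C, G, and T in the given DNA string."""
    counts = Counter(dna.upper())
    return counts["A"], counts["C"], counts["G"], counts["T"]

if __name__ == "__main__":
    print(*nucleotide_counts("AGCTTTTCATTCTGACTGCA"))
```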

 
