Thursday, October 18, 2012

With Halloween Around the Corner, Super Computers are getting "Brainy"

Just when you thought supercomputer technology had reached its peak, a new project has arrived that is (almost literally) mind-blowing. An article by CNN highlights the Human Brain Project, a billion-dollar plan to replicate the human brain inside a supercomputer.

What sounds like a science-fiction movie come true is actually a project geared toward curing conditions like depression, Parkinson's disease, and Alzheimer's disease. The project may change medical imaging methods for those conditions because it attempts to simulate the tangles of neurons and synapses that power higher-order functions like movement and reasoning.
The Human Brain Project piggybacks on an earlier project in Switzerland that studied tiny slivers of rodent gray matter and fed the resulting data and algorithms into a supercomputer. Now that breakthroughs have been made in predicting thought, more funding and more supercomputers are needed to expand the project.

The project's bold claims have earned the scientists the nickname "Team Frankenstein" and their computer comparisons to "Skynet," the artificial intelligence that unleashed a robot war upon Earth in the "Terminator" films.

Sean Hill, a neuroscientist on the project, laughs at the comparisons and says the supercomputer will mostly be used for educational purposes, allowing scientists to conduct experiments without having to probe inside human skulls.

To read more about the project, check out the CNN article here.

Tuesday, October 16, 2012

Go Pink this October with Cancer-Analyzing Super Computers

NantHealth, a health IT company, has teamed with AT&T, HP, Intel, and Verizon to create a supercomputer, similar to the Powerserve Quatro, that will reduce the time required to analyze a cancer patient's genomic data. What would normally have taken eight weeks now takes a mere 47 seconds.

In the past, doctors have been unable to guide cancer treatment using genomic sequencing because of the amount of time it took, which spurred the need for a solution. Teaming with these organizations, NantHealth created a high-speed fiber network linked to the supercomputer. The network will provide thousands of oncology practitioners with the information they need to fight cancer in a shorter amount of time.

It was reported that the ultimate storage solution helped the company collect 96,512 GB of data for over 3,000 patients and transfer it in under 70 hours. Mathematically, up to 5,000 patients could be analyzed per day with the technology.
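
For the curious, here is a quick back-of-the-envelope check of those reported figures. This is our rough sketch only: we assume exactly 3,000 patients and perfectly parallel analysis pipelines, neither of which the article states.

```python
# Back-of-the-envelope check of the reported NantHealth figures
# (illustrative only; patient count and parallelism are assumptions).
total_gb = 96_512            # reported data volume
patients = 3_000             # "over 3,000 patients"
hours = 70                   # reported transfer window
seconds_per_analysis = 47    # reported per-patient analysis time

gb_per_patient = total_gb / patients                  # ~32 GB each
throughput_gbit = total_gb * 8 / (hours * 3600)       # ~3 Gbit/s sustained
sequential_per_day = 86_400 // seconds_per_analysis   # ~1,838 one at a time

print(f"{gb_per_patient:.1f} GB per patient")
print(f"{throughput_gbit:.2f} Gbit/s sustained transfer rate")
print(f"{sequential_per_day} analyses/day sequentially; about "
      f"{5_000 / sequential_per_day:.1f} parallel pipelines would be "
      f"needed to reach 5,000 patients/day")
```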

So be sure to wear your pink with pride this October, Breast Cancer Awareness Month, and spread the hope and the news about this new technological breakthrough in cancer treatment. To read more about NantHealth and the new technology, click here.



Mira, Mira on the Wall, What happened during the Big Bang?

An article recently appeared about a new supercomputer that will run the most complex universe simulation ever attempted: the epitome of a virtual machine. The supercomputer, Mira, is set to trace the origin, evolution, and structure of the universe across its roughly 13-billion-year existence in about two weeks of computing time. The computation will begin in October, piggybacking on sky-mapping projects that have provided a vast amount of knowledge about the structure of the current universe.


The challenge that has plagued cosmologists throughout history is figuring out exactly what occurred in the first nascent galaxies. To explore the beginning, they have to build a new universe. They have crafted mathematical narratives that explain why some galaxies flew apart from one another while others clustered into what we see around us today. Mira's trillions-of-particles simulation will cram over 12 billion years' worth of cosmic evolution into just two weeks. Can you imagine the size of that cloud storage data center?
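
At its core, a simulation like this is a gravitational N-body computation. Mira's actual code uses far more sophisticated tree and particle-mesh methods at vastly larger scales, but a toy direct-summation sketch (ours, for illustration only) shows the basic idea:

```python
import numpy as np

# Toy direct-summation N-body integrator (illustrative only; real
# cosmological codes use tree/particle-mesh methods, model the expansion
# of space, and run with trillions of particles).
G = 1.0           # gravitational constant in simulation units (assumed)
SOFTENING = 1e-2  # avoids infinite forces at tiny separations

def accelerations(pos, mass):
    # d[i, j] = pos[j] - pos[i], the vector from particle i to particle j
    d = pos[np.newaxis, :, :] - pos[:, np.newaxis, :]
    r2 = (d ** 2).sum(axis=-1) + SOFTENING ** 2
    inv_r3 = r2 ** -1.5
    np.fill_diagonal(inv_r3, 0.0)  # no self-interaction
    # a_i = G * sum_j m_j * d_ij / |d_ij|^3
    return G * (d * inv_r3[..., np.newaxis]
                * mass[np.newaxis, :, np.newaxis]).sum(axis=1)

def leapfrog_step(pos, vel, mass, dt):
    # kick-drift-kick, the standard symplectic integrator for N-body work
    vel += 0.5 * dt * accelerations(pos, mass)
    pos += dt * vel
    vel += 0.5 * dt * accelerations(pos, mass)
    return pos, vel

rng = np.random.default_rng(0)
pos = rng.standard_normal((256, 3))   # 256 particles here, not trillions!
vel = np.zeros((256, 3))
mass = np.ones(256) / 256
for _ in range(100):
    pos, vel = leapfrog_step(pos, vel, mass, dt=0.01)
```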

By the end of the two weeks, cosmologists hope that the final product will resemble what we see in current mappings of the universe. With ongoing technological advances, supercomputers of the late 2010s are expected to be a thousand times more powerful than Mira. These virtual universes will serve as testing grounds for some of the most sophisticated ideas ever proposed about the cosmos.

To read more of the article on Mira by The Atlantic, click here.

Friday, September 21, 2012

A Supercomputing Revolution may make Star Trek's Warp Drive Possible

For all you Captain Kirk and supercharged workstation fans out there, moving faster than the speed of light may become a reality. The idea rests on the theory that the fabric of space (called space-time) can be manipulated to permit objects to move faster than photons of light.


Harold White, a NASA scientist, believes he has created a feasible model for a warp drive, piggybacking off the original model by Alcubierre, which consisted of a football-shaped craft attached to a ring that would constantly revolve around it. The ring would be made of some kind of exotic matter that would cause space-time to warp into a bubble. The bubble theoretically allows the spaceship's engine to compress the space ahead of it and expand the space behind it, so the ship would essentially arrive somewhere else without locally moving at all. The problem with the old theory is that creating the bubble would require energy equivalent to the mass-energy of Jupiter.
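
For readers who want the underlying math, Alcubierre's 1994 paper describes the warp bubble with the following metric (reproduced here from the published form as background; it is not quoted in the source article):

```latex
% Alcubierre's warp-bubble metric, with the craft moving along the x-axis:
ds^2 = -c^2\,dt^2 + \bigl(dx - v_s(t)\,f(r_s)\,dt\bigr)^2 + dy^2 + dz^2
```

Here v_s = dx_s/dt is the bubble's speed and f(r_s) is a smooth shaping function equal to 1 inside the bubble and falling to 0 far outside; that shaping is what compresses space ahead of the craft and expands it behind.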

White has transformed the original theory by suggesting the ring be donut-shaped rather than flat. He has additionally suggested that "the intensity of the space warps can be oscillated over time," which would reduce the energy the craft requires. White and his team have been running "tabletop experiments" with a "laser interferometer," essentially creating miniature space-time warps in the lab.
If technology runs with this theory, Earthlings could potentially visit other stars aboard a craft about the size of the Voyager probe launched in 1977, traveling at 10 times the speed of light. Can you imagine the speed of the high performance server on that craft?

To read the source for this post click here.

Tuesday, September 18, 2012

What's Shaking? Scientific Computational Modeling of Earthquakes

Scientists utilize high-performance servers and high-performance clusters to measure and predict the environment's activity. Earthquakes are a geological hazard that certainly plagues the PSSC Labs headquarters in California on a fairly regular basis. While little is known about predicting earthquakes, theorists are beginning to question whether oil drilling has contributed to the increase in earthquake frequency, and Oil & Gas computational modeling is being looked at as a tool for future studies. Whatever the reason behind the earthquakes, what is fascinating is how they happen.


An earthquake occurs when two blocks of the earth suddenly slip past one another along a "fault plane." While the edges of the fault are stuck together and the rest of each block is moving, the energy that would normally let the blocks slide past one another is stored as strain. When the force of the movement finally overcomes the friction of the jagged edges, the fault unsticks and the stored energy is released as vibrations. The point below the ground where the earthquake begins is called the focus (or hypocenter); the point on the surface directly above it is the epicenter. Three phases of shocks can occur: "foreshocks," smaller quakes that happen beforehand in the same location; the "mainshock," the largest event and true core of the earthquake; and, depending on the size of the mainshock, "aftershocks" that can continue for weeks, months, or even years. The energy of any shock radiates outward from the focus in all directions as seismic waves, much like ripples in a pond. As the waves move, they shake the Earth and everything on it.
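
That stick-slip cycle, strain building slowly until friction gives way, is exactly what simple computational models capture. Below is a toy "spring-slider" sketch of the idea; the parameter values are ours and purely illustrative, not a calibrated seismic model:

```python
# Toy "spring-slider" model of the stick-slip cycle described above
# (parameter values are made up for illustration, not calibrated).
k = 1.0           # spring stiffness: strain energy grows while stuck
v_plate = 0.01    # steady plate motion per time step
f_static = 2.0    # force needed to unstick the fault
f_dynamic = 0.5   # friction while sliding (lower, so the slip overshoots)

stretch = 0.0     # elastic strain accumulated across the stuck fault
events = []
for t in range(10_000):
    stretch += v_plate             # blocks keep moving; fault stays stuck
    if k * stretch > f_static:     # force finally overcomes friction...
        slip = stretch - f_dynamic / k
        events.append((t, slip))   # ...and the stored energy releases at once
        stretch = f_dynamic / k    # residual stress left after the mainshock

print(f"{len(events)} slip events over 10,000 steps; "
      f"largest slip = {max(s for _, s in events):.2f}")
```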


Navigate through our website to learn more about our high-performance servers and high-performance clusters that help scientists store and analyze this fascinating information on nature.

Photo Credit: earthquake.usgs.gov & Science Photo Lab

Friday, September 14, 2012

Raspberry Pi Structure Turns the Heads of Supercharged Workstation Engineers

Apple, blueberry, lemon meringue? There are so many kinds of pie out there, but the favorite of those working with supercharged workstations is the Raspberry Pi. And no, this one isn't surrounded by a delicious graham-cracker crust: it is a credit-card-sized Linux machine that has become an instant hit. For the first time ever, someone has interconnected dozens of these little devices to build a mega supercomputer. And what did they use for the glue? LEGOs.


Seven computational engineers at the University of Southampton in the UK ordered 64 Raspberry Pis and built a LEGO rack to house and interconnect them. While the Raspberry Pi certainly does not have the performance of our Powerwulf Clusters, networking that many devices gives this machine, built from the ground up with toys, a respectable surge of power. Although it isn't the strongest, it is certainly the most adorable.
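
Clusters like this one typically split work across nodes with a message-passing library such as MPI. Here is a minimal sketch of that idea in Python using mpi4py; this is our illustration of the general technique, not the Southampton team's actual software stack:

```python
# Minimal message-passing sketch (requires mpi4py and an MPI runtime).
# Run with, e.g.:  mpiexec -n 64 python hello_cluster.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()   # this node's ID: 0..63 on a 64-Pi cluster
size = comm.Get_size()   # total number of nodes in the job

local = rank + 1         # each node contributes a value...
total = comm.reduce(local, op=MPI.SUM, root=0)  # ...gathered on node 0
if rank == 0:
    print(f"{size} nodes reporting; sum of contributions = {total}")
```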


This project was basically a smaller-scale version of what we at PSSC Labs do when we create our high performance servers and ultimate storage solutions. Feel free to navigate through our website and learn more about the process of computer building on a large scale.

Thursday, September 6, 2012

High Performance Computing happens more than ‘Once in a Blue Moon.’

Weather computational modeling and studies in astrophysics computing are progressing with the advent of high-performance computing, and PSSC Labs supports research scientists across a wide variety of disciplines. Naturally, there was a lot of excitement when word spread about the appearance of a blue moon. The evening of August 31, 2012 brought a sight to see.


A "blue moon" is the name given to the second full moon in a single month. It is a somewhat rare occurrence as it only happens once every three years.

Sometimes, however, the moon can appear visually blue in color, and this hue is possible whether the moon is full or not. The effect is caused by materials and particles in the atmosphere, such as dust or smoke. The secret to witnessing a literally blue moon is particles slightly wider than the wavelength of red light, with few other sizes present; such particles scatter red light more strongly, leaving the moonlight that reaches your eye looking blue. On rare occasions, volcanoes produce clouds containing such particles, and forest fires can have the same effect. Most clouds pushed into the atmosphere, however, contain a wider variety of particle sizes, including many smaller than 1 micrometer, and smaller particles tend to scatter blue light. Those clouds cause "red moons," which are much more common than blue moons.
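
The particle-size condition can be made concrete with the optical "size parameter" x = 2πr/λ, which compares a particle's radius to the wavelength of light. A quick sketch of the comparison follows; the radii are illustrative smoke/ash values we chose, not measured data:

```python
import math

# Size parameter x = 2*pi*r / wavelength: roughly, x << 1 is the Rayleigh
# regime (blue scattered most, giving "red moons"), while x near or above
# 1 means the particle starts to scatter longer wavelengths efficiently.
WAVELENGTH_NM = {"blue": 450, "red": 700}

def size_parameter(radius_nm, wavelength_nm):
    return 2 * math.pi * radius_nm / wavelength_nm

for radius_nm in (100, 500, 1000):   # illustrative particle radii
    xs = {color: size_parameter(radius_nm, wl)
          for color, wl in WAVELENGTH_NM.items()}
    print(f"r = {radius_nm:4d} nm: x_blue = {xs['blue']:.2f}, "
          f"x_red = {xs['red']:.2f}")
```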

Navigate through the PSSC Labs website and learn more about the possibilities for high-performance servers and high-performance clusters in these fascinating industries.

Photo Credit: Tomasjina/www.Space.com; www.pikespeakphoto.com/images/sunmoon/redmoon2.jpg