Thursday, October 18, 2012

With Halloween Around the Corner, Supercomputers Are Getting "Brainy"

Just when you thought supercomputer technology had reached peak performance, a new project arrives that is, almost literally, mind-blowing. An article by CNN highlights the Human Brain Project, a billion-dollar plan to replicate the human brain inside a supercomputer.

What sounds like a science-fiction movie coming true is actually a project geared toward curing conditions like depression, Parkinson's disease and Alzheimer's. The project may change the methods of medical imaging for those conditions because it attempts to simulate the tangle of neurons and synapses that powers high-level functions like movement and reasoning.
The Human Brain Project piggybacks on a project in Switzerland that studied tiny slivers of rodent gray matter and fed a computer huge amounts of data and algorithms. Now that breakthroughs have been made in predicting brain activity, the project needs more funding and more supercomputers to expand.

The project's bold claims have earned the scientists the nickname "Team Frankenstein" and their computer the nickname "Skynet," after the artificial intelligence that unleashed a robot war upon Earth in the "Terminator" films.






Sean Hill, a neuroscientist on the project, laughs at the comparisons and says the supercomputer will be used mostly for educational purposes, allowing scientists to conduct experiments without probing inside human skulls.

To read more about the project, check out the CNN article here.

Tuesday, October 16, 2012

Go Pink this October with Cancer-Analyzing Supercomputers

NantHealth, a health IT company, has teamed with AT&T, HP, Intel and Verizon to create a supercomputer, similar to the PowerServe Quattro, that reduces the time required to analyze a cancer patient's genomic data. What would normally have taken eight weeks now takes a mere 47 seconds.

In the past, doctors were unable to guide cancer treatment using genomic sequencing because of the time it took, which spurred the need for a solution. Teaming with several organizations, NantHealth created a high-speed, supercomputer fiber network that will provide thousands of oncology practitioners with the information to fight cancer in far less time.

It was reported that the ultimate storage solution helped the company collect 96,512 GB of data for over 3,000 patients and transfer it in under 70 hours. Mathematically, 5,000 patients could be analyzed per day with the technology.
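Working the reported figures out with quick arithmetic puts them in perspective. The sketch below uses only the numbers quoted above; the per-patient and throughput breakdowns are our own illustration, not figures from NantHealth.

```python
# Back-of-the-envelope check of the reported NantHealth figures.
total_gb = 96_512        # reported data collected
patients = 3_000         # reported patient count
transfer_hours = 70      # reported transfer time

gb_per_patient = total_gb / patients
throughput_gbps = total_gb * 8 / (transfer_hours * 3600)  # gigabits per second

print(f"~{gb_per_patient:.0f} GB of genomic data per patient")
print(f"~{throughput_gbps:.1f} Gb/s sustained transfer rate")
```

Roughly 32 GB per patient moved at a sustained multi-gigabit rate, which gives a sense of why a dedicated fiber network was needed.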

So be sure to wear your pink with pride this October, Breast Cancer Awareness Month, and spread the hope and the news about this technological breakthrough in cancer treatment. To read more about NantHealth and the new technology, click here.



Mira, Mira on the Wall, What Happened During the Big Bang?

An article recently appeared about a new supercomputer that will run the most complex universe simulation ever attempted, the epitome of a virtual machine. The supercomputer, Mira, is expected to trace the origin, evolution and structure of the universe's 13-billion-year existence in about two weeks. The computation will begin in October, piggybacking on sky-mapping projects that have provided a vast amount of knowledge about the structure of the current universe.


The challenge that has long plagued cosmologists is figuring out exactly what occurred in the first nascent galaxies. To explore the beginning, they have to build a new universe. They have crafted mathematical narratives that explain why some galaxies flew apart from one another while others clustered into what we see around us today. Mira's trillions-of-particles simulation will cram over 12 billion years' worth of cosmic evolution into just two weeks. Can you imagine the size of that cloud storage data center?
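At its core, a cosmological simulation like Mira's is a gravitational N-body computation: every particle pulls on every other, stepped forward in time. Here is a toy sketch of that technique with three particles instead of trillions; the masses, units and step size are arbitrary illustrative choices, not anything from the Mira project.

```python
import numpy as np

# Toy gravitational N-body integrator (kick-drift-kick leapfrog),
# the same basic technique cosmological codes scale to trillions
# of particles. All constants here are illustrative.
rng = np.random.default_rng(0)
n, G, dt, softening = 3, 1.0, 0.01, 0.1
pos = rng.standard_normal((n, 3))
vel = np.zeros((n, 3))
mass = np.ones(n)

def accelerations(pos):
    # Pairwise gravitational pull on each particle from every other one.
    diff = pos[None, :, :] - pos[:, None, :]              # (n, n, 3) separations
    dist3 = (np.sum(diff**2, axis=-1) + softening**2) ** 1.5
    return G * np.sum(diff * (mass[None, :, None] / dist3[:, :, None]), axis=1)

for _ in range(100):
    vel += 0.5 * dt * accelerations(pos)  # kick
    pos += dt * vel                       # drift
    vel += 0.5 * dt * accelerations(pos)  # kick

print(pos.round(2))
```

The real codes add expansion of space, dark matter physics and adaptive resolution, but the inner loop, gravity plus a time stepper, is recognizably this.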

By the end of the two weeks, cosmologists hope the final product will resemble current mappings of the universe. With continued technological advances, supercomputers in the late 2010s are expected to be a thousand times more powerful than Mira, and these virtual universes will serve as testing grounds for some of the most sophisticated ideas about the cosmos ever conceived.

To read more of the article on Mira by The Atlantic, click here.

Friday, September 21, 2012

A Supercomputing Revolution may make Star Trek's Warp Drive Possible

For all you Captain Kirk and supercharged workstation fans out there, moving faster than the speed of light may become a reality. The idea is related to the theory that the fabric of space (called space-time) can be manipulated to permit objects to move faster than photons of light.


Harold White, a NASA scientist, believes he has created a feasible model for a warp drive. His design piggybacks on the original model by Alcubierre, which consisted of a football-shaped craft attached to a ring that would constantly revolve around it. The ring would be made of some kind of exotic matter that would warp space-time into a bubble. The bubble theoretically allows the spaceship's engine to compress the space ahead of it and expand the space behind it, so the ship would essentially arrive somewhere else without locally moving at all. The problem with the old theory is that creating the bubble would require the mass-energy equivalent of Jupiter.

 
White has transformed the original theory by suggesting the ring be shaped more like a donut than a flat band. He has additionally suggested that "the intensity of the space warps can be oscillated over time," which would reduce the required energy for the craft. White and his team have been doing "tabletop experiments" using a "laser interferometer," essentially making mini space-time warps.
If technology runs with this theory, Earthlings could potentially visit other stars aboard a craft about the size of 1977's Voyager, traveling at 10 times the speed of light. Can you imagine the speed of the high performance server on that craft?

To read the source for this post click here.

Tuesday, September 18, 2012

What's Shaking? It's Scientific Computational Modeling of Earthquakes.

Scientists utilize high-performance servers and high-performance clusters to measure and predict the environment's activity. Earthquakes are a geological occurrence that certainly plagues the PSSC Labs headquarters in California on a fairly regular basis. While little is known about predicting earthquakes, theorists are beginning to question whether oil drilling has contributed to the increase in earthquake frequency, and oil and gas computational modeling is being considered as a tool for future studies. Whatever the reason behind the earthquakes, how they happen is fascinating.


An earthquake occurs when two blocks of the earth suddenly slip past one another along a "fault plane." While the edges of the fault are stuck together and the rest of each block is moving, the energy that would normally cause the blocks to slide past one another is stored. When the force of the movement finally overcomes the friction of the jagged edges, the fault unsticks and the stored energy releases, causing the vibrations. The location below the ground where the earthquake begins is called the "hypocenter" (or focus); the point on the surface directly above it is the "epicenter." Three phases of shocks can occur: "foreshocks," which happen before the earthquake in the same location; the "main shock," which is the largest and true core of the earthquake; and "aftershocks," which, depending on the size of the main shock, can continue for weeks, months or even years. The energy of any shock radiates outward from the source in all directions in seismic waves, much like ripples in a pond. As the waves move, they shake the Earth and anything on it.
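Those radiating seismic waves are also how seismologists locate a quake: fast P waves outrun slower S waves, and the lag between their arrivals at a station reveals the distance to the source. A minimal sketch, using typical textbook crustal wave speeds rather than site-specific data:

```python
# Seismic waves travel at different speeds: P waves (~6 km/s in crust)
# arrive before S waves (~3.5 km/s). The S-minus-P arrival lag at a
# seismometer gives the distance to the quake's source.
# The speeds are generic textbook values, assumed for illustration.
vp, vs = 6.0, 3.5  # km/s

def distance_from_sp_lag(lag_seconds):
    # distance/vs - distance/vp = lag  =>  solve for distance
    return lag_seconds / (1 / vs - 1 / vp)

print(f"{distance_from_sp_lag(10):.0f} km")  # a 10-second S-P lag
```

With lags from three or more stations, circles of these distances intersect at the source, which is how quake locations are triangulated.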


Navigate through our website to learn more about our high-performance servers and high-performance clusters, which assist in storing and processing this fascinating information about nature.

Photo Credit: earthquake.usgs.gov & Science Photo Lab

Friday, September 14, 2012

Raspberry Pi Structure Turns the Heads of Supercharged Workstation Engineers

Apple, blueberry, lemon meringue? There are so many kinds of pie out there, but the favorite of those working with supercharged workstations is the Raspberry Pi. And no, this one doesn't come in a delicious graham cracker crust; it is a credit-card-sized Linux machine that has become an instant hit. For the first time ever, someone has interconnected dozens of these little devices to build a supercomputer. And what did they use for the glue? LEGOs.


Seven computational engineers at the University of Southampton in the UK ordered 64 Raspberry Pis and built a rack to interconnect them. While the Raspberry Pi certainly does not have the performance of our Powerwulf Clusters, networking that many devices together yields a respectable surge of power from a machine built, from the ground up, with toys! It may not be the strongest supercomputer, but it is certainly the most adorable.


This project was essentially a smaller-scale version of what we at PSSC Labs do when we create our high performance servers and ultimate storage solutions. Feel free to navigate through our website and learn more about the process of computer building on a large scale.

Thursday, September 6, 2012

High Performance Computing happens more than ‘Once in a Blue Moon.’

Weather computational modeling and studies in astrophysics computing are progressing with the advent of high-performance computing. PSSC Labs supports research scientists across all the disciplines they study, so naturally there was a lot of excitement when word spread about the appearance of a blue moon. The evening of August 31, 2012 brought a sight to see.


A "blue moon" is the name given to the second full moon in a single calendar month. It is a somewhat rare occurrence, happening only about once every three years.

Sometimes, however, the moon can appear visually blue in color. This hue is possible whether the moon is full or not. The event is caused by materials and particles in the atmosphere, such as dust or smoke. The secret to witnessing a literally blue moon is having particles slightly wider than the wavelength of red light present, with few other sizes mixed in. On rare occasions, volcanoes produce clouds that emit such particles, and forest fires can have the same effect, spreading wide particles that leave the moonlight looking blue. Most clouds pushed into the atmosphere contain a wider variety of particles, including many smaller than 1 micrometer, and these smaller particles tend to scatter blue light. Such clouds cause "red moons," which are much more common than blue moons.
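The red-moon case has simple physics behind it: for particles much smaller than the wavelength, Rayleigh scattering strength scales as one over wavelength to the fourth power, so blue light is scattered out of the moonbeam much more strongly than red. A quick calculation with representative wavelengths (our illustrative choices, not measured values):

```python
# Rayleigh scattering goes as 1/wavelength^4, so shorter (blue)
# wavelengths scatter out of the beam more than longer (red) ones,
# leaving the transmitted moonlight reddened.
blue_nm, red_nm = 450, 650  # representative wavelengths, assumed
ratio = (red_nm / blue_nm) ** 4
print(f"blue light is scattered ~{ratio:.1f}x more strongly than red")
```

A blue moon needs the opposite, particles sized to preferentially scatter red, which is why it takes such an unusual mix of atmospheric debris.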

Navigate through the PSSC Labs website and learn more about the possibilities for high-performance servers and high performance clusters in these fascinating industries.

Photo Credit: Tomasjina/www.Space.com; www.pikespeakphoto.com/images/sunmoon/redmoon2.jpg

Tuesday, September 4, 2012

What is the Higgs Boson and why is it important to the Life Science Industry?

Users of PSSC Life Sciences Tools are talking about what could be the next big discovery in their industry. The Large Hadron Collider is a giant scientific instrument that utilizes some incredible HPC supercomputers to study the smallest existing particles. Researchers at the lab say they are all but certain about the existence of the Higgs boson, or the "God particle," a never-before-seen subatomic particle thought to be a fundamental building block of the universe.

The particle was first proposed by a physicist named Peter Higgs in the 1960s to explain how particles obtain mass. The theory is that an energy field exists all around the universe. As particles zoom through the field, they interact with and attract Higgs bosons, which cluster around them in varying numbers. The more Higgs bosons a particle attracts, the greater its mass.

While this discovery won't give human beings complete knowledge of the universe and its operations, it will fill a huge hole in the Standard Model that has stood for over 50 years.

However, it is going to take some major effort and some intense computing power to track down the Higgs boson. Scientists describe the particles as elusive: they appear and disappear quickly, and researchers can only study their disintegrating remnants.

Finding the missing block in the Standard Model could also lead scientists to study dark matter, a mysterious, invisible substance that accounts for roughly five times as much mass as the ordinary atoms and particles to which we are accustomed. With the amount of information that could result from this potential discovery, science is going to need ultimate storage solutions for sure!

For more information on life science supplies, explore www.pssclabs.com.

Monday, August 27, 2012

Watson’s Complex Data Center Allows for Human Interaction and an Evolution in the Medical Industry

What is "a super smart computer?" For those who don't already know Watson, he is a cognitive system backed by a high-performance supercomputer that made its television debut on Jeopardy! a few years back, completely outscoring the other contestants. As of the summer of 2012, specialists are saying that Watson may transform how organizations operate in the future and supercharge their current high performance servers and workstations.


The Watson project began in 2006 with a team of researchers from leading universities. The system developed the ability to discern double meanings of words, puns, rhymes and inferred hints at a rapid pace. Of course, as human beings, people gain these skills through a lifetime of social experience. To give Watson the same abilities, developers had to focus on natural language processing, hypothesis generation and evidence-based learning.
So what is next for Watson? Since June there has been discussion of how Watson's artificial intelligence might assist doctors in sifting through medical information and mountains of research data to help apply that knowledge to treating patients. At this stage, Watson still has a few years of development before reaching that goal, so hospitals will still have to rely on medical imaging tools and archiving solutions. Here at PSSC we are interested in bringing the newest and brightest technologies to market, including our cloud storage data center.

Friday, August 10, 2012

High Performance Computer Questions about the Curiosity Rover on Mars

With the news of Curiosity landing on Mars, the high performance server community is just giddy, and those utilizing supercharged workstations are "curious" about what is aboard. The Mars Science Laboratory rover is assessing the planet's environment and whether the planet has ever been a habitat for life. Its mission is separated into four parts:

A.    Assess the biological potential of at least one target area of the planet.
B.    Characterize the geology of the landing site.
C.    Investigate past planetary processes relevant to being a habitat.
D.    Characterize the surface radiation in the Mars environment.

Curiosity's Earth weight is about 1,982 lbs (899 kg), and its tools are capable of verifying the conditions that would be needed for Mars life: liquid water, necessary chemicals and an energy source. Listed below are eight science instruments and systems aboard Curiosity.

1.    Antennas: There are three Antennas aboard the craft that transmit data to orbiting Mars Satellites that relay information directly to Earth.
2.    Nuclear Battery: The battery on the rear of the craft is quite powerful, providing up to 14 years of electric power.
3.    Mobility System: Curiosity is about the size of a Mini Cooper, but its mobility system is adapted to the distant terrain. Six wheels drive the rover at a top speed of 1.5 inches per second, and it can cover at least 12 miles of Martian terrain.
4.    Jointed Robotic Arm with Instrument Turret: This arm spans to seven feet and carries a rock drill, soil sampling scoop, radiation-emitting experiment and a camera equipped with a magnifying lens.
5.    Mast: This tool carries wide-angle and telephoto digital cameras as well as a powerful infrared laser for analyzing rock compositions.
6-7.    2-Megapixel Mastcams: These cameras tower 6.5 feet above the Martian surface, capturing 360-degree panorama, stills, and HD video.
8.    Hardware: Curiosity is powered by a RAD750, a single-board computer. It is currently one of the most popular computers for spacecraft, being hundreds of times faster than the Apollo Guidance Computer used in the Moon landings. It can withstand high levels of radiation and temperatures between minus 55 and 70 degrees Celsius. A second, identical RAD750 stands by to take over if anything happens to the first.
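The mobility figures above invite a quick sanity check: how long would that 12-mile traverse take at top speed? A back-of-the-envelope sketch using only the numbers quoted in the list:

```python
# Traverse time at Curiosity's quoted top speed.
top_speed_in_per_s = 1.5   # inches per second, from the post
miles = 12                 # minimum planned traverse, from the post
inches = miles * 5280 * 12             # miles -> feet -> inches
seconds = inches / top_speed_in_per_s
print(f"{seconds / 3600:.0f} hours of continuous driving")
```

Roughly 141 hours of flat-out driving, which is why the traverse is planned over years of short, carefully chosen drives rather than one road trip.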

So are they going to find life on Mars? Not on this trip. The rover cannot detect present-day life or fossilized microorganisms; only time and experience can answer those questions bordering on imagination. Like users of PSSC's Computational Chemistry workstation, Earthlings gain superior knowledge through greater experiences spurred by curiosity.

==

For more details about PSSC Labs' high performance servers, supercharged workstations and industry leading clusters and clouds, visit us online at http://pssclabs.com.

Friday, August 3, 2012

Massive Memory for High-Performance Servers

August 3, 2012 | PSSC Labs is releasing massive memory systems for high-performance servers and computers: the PowerServe Quattro Servers. To put their memory power into perspective, a standard Windows HP computer has around 150 megabytes of memory storage; these devices contain 512 gigabytes, which translates to almost 3,500 times the memory storage of a standard computer. Pretty impressive. The servers support up to 64 processor cores and up to 40 terabytes of redundant storage space, which is ideal when saving expensive data. No one wants to lose expensive research to a technical glitch.
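The "almost 3,500 times" comparison can be checked with quick arithmetic. The sketch below uses the figures quoted above; the 150 MB baseline is the post's own number, not ours:

```python
# Working out the post's memory comparison.
quattro_gb = 512      # PowerServe Quattro memory, from the post
baseline_mb = 150     # quoted baseline figure, from the post
ratio = quattro_gb * 1024 / baseline_mb   # convert GB to MB, then divide
print(f"~{ratio:.0f}x the baseline")
```

That works out to about 3,495x, matching the "almost 3,500 times" claim.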

These PowerServe Quattro Servers are even more impressive in their flexibility. They can be custom configured to meet any budget or specification, and they can be shipped and supported anywhere in the world. You may be wondering, who needs this much storage? Well, PSSC Labs is actively listening to its clients. At a genomics conference they attended, concerns were frequently expressed about the need for large memory systems, and PSSC answered the call by releasing these powerful devices. With that amount of memory, it is tough to forget about PSSC Labs. Be sure to explore the options for Ultimate Storage Solutions in your industry.

Description: Custom massive-memory systems for high-performance computers and servers, with 512 gigabytes of powerful memory storage.

==

For more details about PSSC Labs' high performance servers, supercharged workstations and industry leading clusters and clouds, visit us online at http://pssclabs.com.

Friday, July 27, 2012

PSSC Labs Expands the Computing Capabilities of CIW's PowerWulf Cluster

July 27, 2012 | Flash back to November of 2010 and PSSC Labs is ahead of the game. Computer clusters need more processing power over time in order to remain current for scientific discoveries. When budget became available to the Carnegie Institution of Washington's (CIW) Department of Terrestrial Magnetism, it contacted PSSC Labs to expand the computing capability of its existing PowerWulf Cluster. At the time, there were over 1,000 PowerWulf Clusters operational.

In 2009, CIW purchased a 96-processor-core, 192 GB memory PowerWulf Cluster. The primary research use for the cluster is "adaptive mesh refinement hydrodynamics code for studying mixing and transport processes in the presolar cloud and solar nebula." Sounds pretty smart, doesn't it? That's because it is. The cluster helps scientists understand the formation of nebulas, which, in case you don't know, are regions or clouds of interstellar dust and gas. In order to study and create scientific breakthroughs about the universe and the environment, scientists need computing power to make calculations that would normally take a regular computer system years to process. When organizations use clusters such as the PowerWulf, these calculations can be done in days or even hours.
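The years-to-days compression comes from spreading work over many cores, though real speedups are capped by whatever fraction of the job cannot be parallelized (Amdahl's law). A sketch for a 96-core machine like CIW's; the 5% serial fraction is an illustrative assumption, not a measured property of their code:

```python
# Ideal vs. Amdahl's-law speedup on a 96-core cluster.
# The serial fraction is an assumed illustrative value.
cores = 96
serial_fraction = 0.05

ideal = cores
amdahl = 1 / (serial_fraction + (1 - serial_fraction) / cores)
print(f"ideal speedup: {ideal}x; with 5% serial work: {amdahl:.1f}x")
```

Even a small serial fraction drags the achievable speedup well below the core count, which is why cluster codes work so hard to parallelize everything.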


With such a huge project, a primary concern is the time required to manage, monitor and maintain the computer cluster. PSSC Labs understands the need to deliver a stable solution requiring as few updates as possible. Just when you thought technology was getting better, faster and more reliable, PSSC came out with a more advanced PowerWulf Cluster that has rolled with technological advancements for the past two years. Think about how much your cell phone has changed in the past two years; pretty dramatically. Yet the PowerWulf has maintained its state-of-the-art quality, giving scientists a cosmic bang for their buck.

==

For more details about PSSC Labs' high performance servers, supercharged workstations and industry leading clusters and clouds, visit us online at http://pssclabs.com.

Tuesday, July 10, 2012

Weather Solutions

July 10, 2012 | In case you haven't noticed, weather conversations are everywhere, from segments on the news, to global-warming panic, to the go-to conversation with a stranger on the bus. But many have never thought about the tools used to gain that knowledge in the first place. Like most fields of study, weather research is rapidly advancing with the technological evolution of high-performance servers and computers.

PSSC Labs has developed models that specialize in varying weather investigations, including the MesoScale Modeler, the StormScale Modeler and the Vortex Cluster. These are relied on for dependable forecasting information and air-quality assessments by environmental analysts. Why is this advanced information important? Why can't we just look outside for the weather? Because improved technology means better extreme-weather forecasts for the people who can be affected. On May 22, 2011 in Joplin, Mo., a tornado with 200-plus-mile-per-hour winds claimed the lives of 160 people who received only a 20-minute tornado warning. Twenty minutes is not enough time for anyone to gather a family and get to safety. By continuing to improve weather-forecast systems, such casualties can be prevented.



Remember Hurricane Katrina? It would be nice if future generations knew that a storm the size of Texas was going to hit them sooner rather than later. Depending on your opinion of global warming's existence, the weather may or may not become more extreme with time. Advance notice in weather preparation goes hand in hand with technological advancement in weather forecasting, modeling and research. For more information on PSSC weather-related products, be sure to view the Weather Computation Modeling page on our website.

==

For more details about PSSC Labs' high performance servers, supercharged workstations and industry leading clusters and clouds, visit us online at http://pssclabs.com.

Monday, July 2, 2012

Ingram Micro Offers Up Turnkey Servers and Clusters

July 2, 2012 | Leveraging its alliance with System ArchiTECHS member PSSC Labs and synergies across multiple divisions, Ingram Micro Inc. today announced a new portfolio of customizable, enterprise-grade high performance computing solutions and services.

Available now to channel partners in the U.S., the distributor's new and exclusive line of Artizen High Performance Computing (HPC) offerings includes turnkey high performance servers, ultimate workstations, and customizable supercomputing clusters, as well as computing integration and software installation services.

Designed exclusively for Ingram Micro by PSSC Labs, the new HPC solutions and complementary services will help channel partners more effectively identify, compete and win additional business when it comes to big data, datacenter and private cloud opportunities.

"Ingram Micro's new Artizen HPC offerings enable our channel partners to meet the advanced computing needs of the marketplace without having to invest in the added infrastructure, certifications and personnel to support these higher-end technology solutions services," says Paul Bay, executive vice president, Ingram Micro North America. "The investments we've made and continue to make in advanced computing, cloud and enterprise services, combined with our new alliance with PSSC Labs, are simplifying success for our channel partners competing in this space, and will ultimately open up additional higher-margin opportunities for Ingram Micro, our vendors and solution providers throughout the U.S."

"As technology advances and the need for greater usage capacity grows, demand for high performance computing is rising -- presenting a multimillion dollar opportunity for system builders and VARs, but also a new set of business challenges around organizational scale and working capital," says Alex Lesser, vice president, sales and marketing, PSSC Labs, a well-established and highly regarded national service provider of HPC solutions located in Lake Forest, Calif. "Having the support of Ingram Micro and PSSC Labs will give channel partners the backing and go-to-market strength they need to compete head-to-head in this market and provide clients greater business value and more personalized service."

For more information about Ingram Micro's new portfolio of Artizen High Performance Computing offerings, solution providers and manufacturers should contact their Ingram Micro sales representative or email HPC@ingrammicro.com.

Additional insight on the Ingram Micro System ArchiTECHS community is available online at www.facebook.com/IMSystemArchiTECHS.

More information about Ingram Micro is available at www.ingrammicro.com and http://ingrammicroinc.wordpress.com.

To learn, see and hear more about Ingram Micro online, follow the distributor on Facebook page at www.facebook.com/IngramMicro ; Twitter at www.twitter.com/IngramMicroInc ; and YouTube at http://www.youtube.com/user/ingrammicroinc.

About Ingram Micro Inc.
As a vital link in the technology value chain, Ingram Micro creates sales and profitability opportunities for vendors and resellers through unique marketing programs, outsourced logistics, technical and financial support, managed and cloud-based services, and product aggregation and distribution. The company is the only global broad-based IT distributor, serving more than 145 countries on six continents with the world's most comprehensive portfolio of IT products and services. Visit www.ingrammicro.com.


==

For more details about PSSC Labs' high performance servers, supercharged workstations and industry leading clusters and clouds, visit us online at http://pssclabs.com.

Tuesday, January 17, 2012

PSSC Labs Revolutionary PowerServe DUO T2000 Server Featured on Processor.com

January 17, 2012 | Processor.com published an excellent review of our new server, paired with OCZ Technology SSDs, for the cloud computing market.

Read the full article here.

==

For more details about PSSC Labs' high performance servers, supercharged workstations and industry leading clusters and clouds, visit us online at http://pssclabs.com.