The new catchphrase for corporate America is “big data.”

The ability to analyze massive amounts of information collected from an ever-growing number of sources can give an edge to startups like Bonobos.com, the Manhattan-based online retailer of men’s clothing where my son works, and SmartyPig, the West Des Moines social network/banking business started by Michael Ferrari and Jon Gaskell.

Big data is also securing the future of financial services giants, like Des Moines-based Principal Financial Group Inc.

My first exposure to the term was a January news release from Iowa State University announcing receipt of a $2 million grant that will allow university researchers to sort trillions of bytes of data that help determine which plant seeds grow best in which soil and climate conditions.

The Internet has many references to “big data,” including one in which a University of California, Berkeley, professor credits “big data” with helping President Obama win re-election by helping his campaign find “low-information independents” and target them by “running ads on Jimmy Kimmel and the rerun cable network TV Land.”

Big data would not be possible without the supercomputing pioneered by the U.S. Department of Energy in the 1990s and the cloud computing developed by the private sector during the past decade.

Improvements in computing power and storage capacity allow programmers to create and analyze enormous amounts of data.

The Economist reported on the phenomenon last year, noting: “The world creates 5 exabytes of data every two days. That is roughly the same amount created between the dawn of civilization and 2003.” (An exabyte is a billion billion bytes, a 1 followed by 18 zeros.)

Big data can be text, sound or video. Just think about how many times a day you type something on a computer or send a text message. Think about the number of conversations that are recorded and the growing number of video cameras in public places.

So what can you do with all that data?

Principal Financial Group has several big data initiatives, Gary Scholten, the insurer’s chief information officer, said. They include:

• Risk management. Principal runs massive amounts of data through complex mathematical formulas to predict financial outcomes.

• Cybersecurity. Technicians stay ahead of hackers by plowing through massive amounts of unstructured data to discover weaknesses and fix them.

• Digital oversight of sales. Computers can analyze the large volume of data – emails, text messages, voice mail, even video – generated during the normal course of business, and provide new insight into what works and what doesn’t.

• Near-real-time solutions. By merging and coordinating different databases, Principal hopes to speed up and improve the approval process for customers applying for life insurance products.

Another goal, Scholten said, is to eventually be able to create video databases that can be searched for specific content as well as by time and location.

Data storage for companies like Principal is growing at a rate of 40 to 50 percent a year, according to Scholten.

The volume and variety of data create tremendous opportunities for businesses that know how to use it, he said. But here’s the rub: Right now there is more data than there are people with the skills to analyze it.

Of 752 executives surveyed by The Economist last year, 41 percent said a lack of skilled staff hampers their attempts to process data more rapidly.