ViON on Demand is a service that allows IT organizations to dynamically order and use IT infrastructure – server, storage, compute, and data center networking – as needed, scaling usage up and down to align with the organization’s unique requirements. Not only can the customer have this IT infrastructure installed on-premises, but ViON on Demand also allows for a high level of customization to suit the specific environment and need. […]

The Internal Revenue Service’s research unit has a suitably expansive agenda for big data. The tax agency pulls in voluminous data on revenue collection, refunds, and enforcement efforts. The task of the IRS Research, Analysis, and Statistics division is to sort through that data and help the tax agency make better decisions. The division’s responsibilities include econometric modeling, forecasting, and compliance studies. It also serves as the IRS’s focal point for developing research databases and supporting infrastructure. The research group taps big data to support its activities. The data-driven approach promotes more efficient use of the resources devoted to tax administration, according to Jeff Butler, director of Research Databases within the IRS Research, Analysis, and Statistics division.

[…]

NASA’s Jet Propulsion Laboratory, like many large organizations, is taking on the Big Data problem: the task of analyzing enormous data sets to find actionable information. In JPL’s case, the job involves collecting and mining data from 22 spacecraft and 10 instruments, including the Mars Science Laboratory’s Curiosity rover and the Kepler space telescope. Tom Soderstrom, IT chief technology officer at JPL, joked that his biggest Big Data challenge is more down to Earth: dealing effectively with his email inbox. But kidding aside, JPL now confronts Big Data as a key problem and a key opportunity. “If we define the Big Data era as beginning where our current systems are no longer effective, we have already entered this epoch,” Soderstrom explained.

[…]

For a tough big data challenge, look no further than the U.S. Postal Service (USPS). USPS faces a classic double whammy: the agency has to collect and crunch massive amounts of data, and it has to tackle the job quickly. Speed matters as the agency works to detect fraud, update customers who are tracking mail pieces, and respond to requests from regulators. The Postal Service has answered with an architectural approach designed to rapidly ingest and process data culled from thousands of sources throughout the postal enterprise. Scot Atkins, program manager, Supercomputing & Revenue Protection, USPS, has helped shape the Postal Service’s big data efforts. He cited the agency’s push for real-time processing as perhaps its biggest big data challenge.

[…]

MeriTalk sat down with Scott Pearson, director of big data solutions at Brocade, to discuss the state of Big Data in the Federal government: What is the most interesting thing about Big Data? Who is driving Big Data adoption at agencies? What should IT teams keep in mind as they look to deliver Big Data solutions?

[…]

Big data has less to do with size and more to do with the growing recognition that data and analysis have a seemingly limitless potential for improving government and society. But data alone does not deliver value. Real value is created when government can bring together data – big or traditional – from multiple sources or locations, and present that information in a way that encourages exploration and insight. Qlik allows you to extend big data analytics to the edges of your agency. Read Qlik’s case study to learn more.

[…]

Data has become a critical advantage for information-driven agencies. By providing unprecedented access to actionable information, data lets agencies better understand their operations, improve their services, and ultimately fulfill their mission requirements. To unlock that information, agencies need to effectively operationalize data across the organization. This includes discovering and embedding past-, present-, and future-looking analytics into their end users’ workflows in order to move the metrics that matter.

[…]

Government data is growing, and agencies are looking to leverage big data to support government mission outcomes. However, most agencies lack the data storage/access, computational power, and personnel they need to take advantage of the big data opportunity, according to a MeriTalk study sponsored by NetApp. The report, “The Big Data Gap,” reveals that Federal IT professionals believe big data can improve government, but that the promise of big data remains locked away in unused or inaccessible data.

[…]

Unlike earlier technology trends, big data has emerged as more than just another new technology. It has proven to be one of the most promising yet challenging technologies for both government and industry. Before IT leaders can harness its full potential, there are key issues to address around infrastructure, storage, and training. MeriTalk surveyed 17 visionary big data leaders to find out what they see as the big data challenges and opportunities, as well as how government can best leverage big data.

[…]

Despite the buzz, big data is still a new concept, and many state and local agencies are behind the curve. Although thought leaders are extolling the virtues and benefits of big data, the truth is that to capture its full potential, state and local agencies need the ability to easily store and access data, robust computational power and software to manipulate the data, and trained personnel to analyze it. MeriTalk surveyed state and local IT professionals to better understand where state and local agencies stand with their data today and to identify the gap between the big data opportunity and reality.

[…]
