By Reuven Cohen
May 21, 2009 02:00 PM EDT
I'm happy to announce that the U.S. Federal Government earlier today launched the new Data.Gov website. The primary goal of Data.Gov is to improve access to Federal data and expand creative use of those data beyond the walls of government by encouraging innovative ideas (e.g., web applications). Data.gov strives to make government more transparent and is committed to creating an unprecedented level of openness in Government. The openness derived from Data.gov will strengthen the Nation's democracy and promote efficiency and effectiveness in Government.
As a priority Open Government Initiative for President Obama's administration, Data.gov increases the ability of the public to easily find, download, and use datasets that are generated and held by the Federal Government. Data.gov provides descriptions of the Federal datasets (metadata), information about how to access the datasets, and tools that leverage government datasets. The data catalogs will continue to grow as datasets are added. Federal, Executive Branch data are included in the first version of Data.gov.
Public participation and collaboration will be among the keys to the success of Data.gov. Data.gov enables the public to participate in government by providing downloadable Federal datasets for building applications, conducting analyses, and performing research. Data.gov will continue to improve based on feedback, comments, and recommendations from the public, and it is actively encouraging individuals to suggest datasets they'd like to see, and to rate and comment on current datasets.
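To illustrate the kind of analysis the public can run on a downloaded dataset, here is a minimal Python sketch. The CSV contents below are a hypothetical stand-in for a small extract of a Data.gov dataset (real datasets and their download URLs are listed in the site's catalog).

```python
import csv
import io

# Hypothetical stand-in for a small CSV dataset downloaded from Data.gov.
raw_csv = """state,population
California,36961664
Texas,24782302
New York,19541453
"""

def summarize(csv_text):
    """Parse a CSV dataset and return (row_count, total_population)."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    total = sum(int(row["population"]) for row in rows)
    return len(rows), total

count, total = summarize(raw_csv)
print(count, total)  # 3 rows, combined population
```

The same pattern applies to any tabular dataset from the catalog: download the file, parse it with standard tooling, and aggregate or visualize as needed.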
In a recent interview on NextGov.com, Federal CIO Vivek Kundra shared some details on Data.gov and the Open Government Initiative: "We recognize the power of tapping into the ingenuity of the American people and recognize that government doesn't have a monopoly on the best ideas or always have the best idea on finding an innovative path to solving the toughest problems the country faces. By democratizing data and making it available to the public and private sector ... we can tap into that ingenuity."
One of the most telling aspects of the new Data.gov website is its Data Policy, which states that data accessed through Data.gov do not, and should not, include controls over their end use. In a sense, the U.S. federal government has open-sourced large portions of public information. It will be very interesting to see how people use this fountain of information.
- Public Information
All datasets accessed through Data.gov are confined to public information and must not contain National Security information as defined by statute and/or Executive Order, or other information/data that is protected by other statute, practice, or legal precedent. The supplying Department/Agency is required to maintain currency with public disclosure requirements.
All information accessed through Data.gov is in compliance with the required confidentiality, integrity, and availability controls mandated by Federal Information Processing Standard (FIPS) 199 as promulgated by the National Institute of Standards and Technology (NIST) and the associated NIST publications supporting the Certification and Accreditation (C&A) process. Submitting Agencies are required to follow NIST guidelines and OMB guidance (including C&A requirements).
All information accessed through Data.gov must be in compliance with current privacy requirements including OMB guidance. In particular, Agencies are responsible for ensuring that the datasets accessed through Data.gov have any required Privacy Impact Assessments or System of Records Notices (SORN) easily available on their websites.
- Data Quality and Retention
All information accessed through Data.gov is subject to the Information Quality Act (P.L. 106-554). For all data accessed through Data.gov, each agency has confirmed that the data being provided through this site meets the agency's Information Quality Guidelines.
As the authoritative source of the information, submitting Departments/Agencies retain version control of datasets accessed through Data.gov in compliance with record retention requirements outlined by the National Archives and Records Administration (NARA).
- Secondary Use
Data accessed through Data.gov do not, and should not, include controls over their end use. However, as the data owner or authoritative source for the data, the submitting Department or Agency must retain version control of datasets accessed. Once the data have been downloaded from the agency's site, the government cannot vouch for their quality and timeliness. Furthermore, the US Government cannot vouch for any analyses conducted with data retrieved from Data.gov.
- Citing Data
The agency's preferred citation for each dataset is included in its metadata. Users should also cite the date that data were accessed or retrieved from Data.gov. Finally, users must clearly state that "Data.gov and the Federal Government cannot vouch for the data or analyses derived from these data after the data have been retrieved from Data.gov."
- Public Participation
In support of the Transparency and Open Government Initiative, recommendations from individuals, groups and organizations regarding the presentation of data, data types, and metadata will contribute to the evolution of Data.gov.
- Applicability of this Data Policy
Nothing in this Data Policy alters, or impedes the ability to carry out, the authorities of the Federal Departments and Agencies to perform their responsibilities under law and consistent with applicable legal authorities, appropriations, and presidential guidance, nor does this Data Policy limit the protection afforded any information by other provisions of law. This Data Policy is intended only to improve the internal management of information controlled by the Executive Branch of the Federal Government and it is not intended to, and does not, create any right or benefit, substantive or procedural, enforceable at law or in equity, by a party against the United States, its Departments, Agencies, or other entities, its officers, employees, or agents.