Big Data accessibility for SEC reporting? Not yet. Columbia report explains why.

[This post by Hudson Hollister is cross-posted on the Data Transparency Coalition's blog.]

Last Tuesday Columbia Business School’s Center for Excellence in Accounting and Security Analysis released a definitive report evaluating the implementation of a structured data format for the financial statements that public companies file with the U.S. Securities and Exchange Commission. Over a year in the making and based on extensive discussions and surveys with corporate filers, investors, data and filing vendors, regulators, and others, the report illuminates the promise of structured data to better serve investors, improve the enforcement of securities laws, and make the U.S. capital market more efficient. It also reveals serious flaws in the SEC’s approach thus far – flaws which have prevented that promise from being realized.

The Columbia report is a call to action by both the SEC and Congress. The Data Transparency Coalition is going to pursue that action in 2013.

In 2009, the SEC adopted a requirement for public companies to file each financial statement in the eXtensible Business Reporting Language (XBRL) alongside the regular plain-text version. The requirement was slowly phased in over four years, starting with the largest companies and eventually covering all public companies. The XBRL format imposes a data structure on the financial statements and their notes and footnotes by assigning electronic tags to each item and defining how the items relate to one another.
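
To make the tagging concrete, here is a minimal sketch of what a structured filing looks like to a program. The toy instance document below is invented for illustration: the element names (us-gaap:Revenues, us-gaap:NetIncomeLoss) come from the US GAAP taxonomy, but the company, context, and figures are hypothetical.

    # A minimal sketch, not production code: parse a toy XBRL instance
    # and list its tagged facts. Element names follow the US GAAP
    # taxonomy; the company, context, and figures are invented.
    import xml.etree.ElementTree as ET

    TOY_XBRL = """<?xml version="1.0"?>
    <xbrl xmlns="http://www.xbrl.org/2003/instance"
          xmlns:us-gaap="http://fasb.org/us-gaap/2013-01-31"
          xmlns:iso4217="http://www.xbrl.org/2003/iso4217">
      <context id="FY2012">
        <entity><identifier scheme="http://www.sec.gov/CIK">0000123456</identifier></entity>
        <period><startDate>2012-01-01</startDate><endDate>2012-12-31</endDate></period>
      </context>
      <unit id="USD"><measure>iso4217:USD</measure></unit>
      <us-gaap:Revenues contextRef="FY2012" unitRef="USD" decimals="-6">1500000000</us-gaap:Revenues>
      <us-gaap:NetIncomeLoss contextRef="FY2012" unitRef="USD" decimals="-6">200000000</us-gaap:NetIncomeLoss>
    </xbrl>"""

    US_GAAP = "{http://fasb.org/us-gaap/2013-01-31}"

    root = ET.fromstring(TOY_XBRL)
    for fact in root:
        if fact.tag.startswith(US_GAAP):
            # Every fact carries its concept, reporting period (contextRef),
            # unit, and a machine-readable value - no transcription needed.
            print(fact.tag[len(US_GAAP):], fact.attrib["contextRef"], fact.text)

Each figure arrives already labeled and contextualized, which is exactly what makes automated comparison across companies possible.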

Judging by potential impact, this is the most ambitious data transparency program ever undertaken by the U.S. government. The XBRL reporting requirement transformed all of the public financial statements in the world’s largest capital market from cumbersome text, which must be manually transcribed to allow quantitative analysis by investors and regulators, into an open, standardized, machine-readable format.

In theory, replacing unstructured text with structured data should, by now, have triggered revolutions and disruptions all over the financial industry. The SEC’s XBRL reporting requirement should, by now, have opened up corporate financial statements in the United States to Big Data platforms and applications.

  • Investors – and the analysts serving them – should, by now, have started using powerful new software tools to compare and analyze the newly structured financial statements – and to mash financial figures together with other data sources. They should be making better decisions, evaluating a broader universe of companies, and democratizing the financial industry.
  • Aggregators like Bloomberg and Google Finance should, by now, have started saving money and improving accuracy by ingesting corporate financial data directly from the SEC’s structured XBRL feed instead of manually entering the numbers into their own systems (or paying someone else to do that) – see the sketch below.
  • The SEC should, by now, have incorporated structured corporate financial data into its own review processes, instead of relying on manual reviews of the financial statements in Forms 10-K and 10-Q.
  • Other federal agencies should, by now, have started automatically checking the financial performance of companies as reported to the SEC before bestowing contracts or loan guarantees (among many other possible uses).

None of these things is happening on a large scale – yet. The Columbia report explains why. The Columbia report also hints at what the SEC and Congress can, and should, do about it.
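
What would the aggregator scenario above look like in practice? A rough sketch follows. The monthly XBRL RSS feed URL matches the pattern EDGAR published for structured filings at the time, but both the URL and the feed layout should be treated as assumptions rather than a documented contract.

    # Hedged sketch: pull a monthly EDGAR XBRL RSS feed and list the
    # filings it announces. The URL pattern and feed layout are
    # assumptions, not a guaranteed SEC API.
    import urllib.request
    import xml.etree.ElementTree as ET

    FEED = "https://www.sec.gov/Archives/edgar/monthly/xbrlrss-2013-01.xml"

    req = urllib.request.Request(FEED, headers={"User-Agent": "research sketch"})
    with urllib.request.urlopen(req) as resp:
        rss = ET.parse(resp).getroot()

    # Each <item> in the feed describes one XBRL filing; print the filer
    # and a link to the filing, ready for automated download and parsing.
    for item in rss.iter("item"):
        print(item.findtext("title"), "->", item.findtext("link"))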

 

What does the Columbia report tell us?

  • Investors are demanding structured data – not unstructured text – to track companies’ financial performance. The Columbia authors “have no doubt that [investors'] analysis of companies will continue to be based off increasing amounts of data that are structured and delivered to users in an interactive [structured] format” (p. i). “[T]here is clear demand for timely, structured, machine-readable data including information in financial reports, and … this need can be met via XBRL as long as the XBRL-tagged data can reduce the total processing costs of acquiring and proofing the data, and that the data are easily integrated (mapped) into current processes” (p. 20).
  • Nonetheless, most investors are not making any use of the structured-data financial statements that public companies are now submitting to the SEC. Fewer than ten percent of the Columbia study’s non-scientific sample of investors said they were using XBRL data downloaded directly from the SEC or from XBRL US (p. 61). Instead, most investors were getting their corporate financial information from aggregators like Bloomberg and Google Finance – some free, some not. Moreover, aggregators told Columbia that they were not using XBRL data either. Aggregators were mostly still electronically scraping the old-fashioned plain-text financial statements (which are still being filed alongside the new structured-data financial statements) and manually verifying the numbers – or paying others to do that “labor-intensive” work for them (pp. 26-27).
  • Two problems explain why most investors have not begun to use structured-data financial statements. First, they don’t yet trust the data. “XBRL-tagged SEC data are generally perceived by investors as unreliable,” say the Columbia authors, both because of errors in numbers and categorization and because of companies’ use of unnecessary extensions, hindering comparability (p. 28). Columbia’s review of the quality of structured-data financial statements filed with the SEC (conducted two years ago) revealed that fully 73% of filings had data quality errors (p. 32). Moreover, investors reported “a large number of seemingly unnecessary company-specific tags” (p. 21). Investors surveyed by Columbia were “especially hesitant about using the data until they are comfortable that the XBRL data matches the [plain-text] data in SEC filings” (p. 21). Aggregators, too, were holding off until accuracy and comparability improved.
  • Second, investors don’t yet have a wide range of software tools to compare and analyze structured-data financial statements. “End users are also looking for easy-to-use XBRL consumption and analysis tools that do not require programming or query language knowledge. In general, these users are not willing or able to incur the significant disruption to their workflow that they perceived would be required to incorporate XBRL data without state-of-the-art consumption and analytics tools” (p. 24).
  • If these two problems were fixed, investors could make enthusiastic and productive use of structured-data financial statements. “[T]he potential for interactive data to democratize financial information and transform transparency remains stronger than ever, and many participants, including most investors and analysts, wish that the data were useful today,” say the Columbia authors (p. 4). For instance, “virtually all investors” frequently use information that is available only in the footnotes of corporate financial statements to make their decisions – information that is now submitted and published in XBRL as part of companies’ structured-data filings (p. 48). “With respect to the detailed-tagged footnote data, in particular, several investors and analysts have communicated to us that they view XBRL data as potentially an excellent solution to manually collecting the data they need” (p. 31).
  • Even if most investors aren’t directly using structured-data financial statements, there will be indirect benefits to investors and the markets if the SEC starts using such data for its own reviews. The study reported that “the SEC has begun to review the data to identify filer-wide, as well as individual company filing and financial reporting issues. XBRL data could significantly enhance the efficiency of the Division of Corporate Finance’s review of filings and facilitate a ‘red-flag’ ex-ante approach to regulatory oversight” (p. 25). “Representatives from the FASB and the SEC have both stated on the record that, in their opinions, the amount of time that it takes them to conduct their respective analyses has been reduced significantly by their use of the XBRL-tagged data” (p. 26). Even imperfectly implemented, the XBRL mandate could indirectly benefit investors and the markets by improving the SEC’s review and enforcement processes.

The SEC’s XBRL reporting requirement could deliver transformative data transparency. But it has not. So far its impact has been incremental, not transformative.

To be sure, the problems identified by the Columbia study are problems of execution, not shortcomings of XBRL itself or of the concept of structured data. Investors and the analysts serving them “would like to have the U.S. regulatory filings tagged in a structured (e.g., XBRL) format that would meet their information requirements” (p. 5). For the SEC to eliminate the XBRL reporting requirement entirely – as some filers seem to hope that it will – would be a backward move and a tragic mistake.

Nevertheless, structured data for financial statements is, without doubt, “at a critical stage in its development. Without a serious reconsideration of the technology, coupled with a focus on facile usability of the data, and value-add consumption tools, it will at best remain of marginal benefit to the target audience of both its early proponents and the SEC’s mandate—investors and analysts” (p. ii). 

 

How can these problems be fixed?

How can the SEC fix these problems of reliability and analysis and deliver transformative transparency? The Columbia report suggests four answers:

  • First, insist on accuracy and quality! The SEC does not require companies to amend their filings to correct tagging errors and unnecessary extensions. The Columbia report suggests strongly that it should. The Columbia authors fault “the reticence (or inability) of regulators and filers to ensure that the interactive filings data are accurate and correctly-tagged from day one of their release to the public and forward (or, to communicate to the market for this information that they were not insisting on this and why)” (p. 37). It is “critical” to reduce errors and extensions, either through “greater regulatory oversight and potentially requiring the audit of this data” or through third-party quality checks (pp. 42-43). The SEC’s own interests should motivate it to insist on accuracy once it becomes “serious about using the data in its Corporate Finance function and even for enforcement, as it should” (p. 43) (emphasis added). The need to improve quality might require the SEC and the Financial Accounting Standards Board to consider simplifying the underlying XBRL taxonomy (pp. i, 14, 43).
  • Second, communicate that structured data is not a supplemental feature of a regulatory filing. Rather, it is the filing! The Columbia authors explain that “the reliability of the data has been compromised by the way filers have approached their XBRL filings … [perceiving] XBRL-tagging [as] an additional task in the financial reporting documentation process rather than as a part of the internal data systems” (p. 29). The SEC framed its XBRL reporting rule as a requirement to “create an XBRL-tagged reproduction of the paper or HTML presentations of their filings” (p. 37), rather than “making individual data points available for the end user to utilize or present as they required” (p. 39). Since filers think structured-data financial statements are “incremental to their existing [plain-text] filings, they do not perceive any user need” (p. 35) – and so they take few pains to ensure that investors using their structured-data filings get an accurate picture of their finances. “We believe this presentation-centric step hindered or diverted what should have been an important evolution from a paper presentation-centric view of financial reporting information to a far more transparent and effective data-centric one” (p. 37). One way to correct this situation would be to move to a data format that is both human-readable and machine-readable, combining the plain text and structured-data tags in a single filing. Inline XBRL would do exactly that, and in fact the SEC is considering adopting this format (n. 48); a short sketch of the combined format appears at the end of this section.
  • Third, encourage the development of software tools that make structured-data financial statements come alive! This is something of a chicken-and-egg problem. More software tools will be created as investors demand them. But effective, lightweight, cheap XBRL analysis tools are already on offer – notably Calcbench.
  • Fourth, expand the mandate! The Columbia report is clear that investors want more regulatory information tagged and structured, not less (p. 28):

i. The data that are required by the SEC to be XBRL-tagged are all relevant in varying degrees to some subset of the investor/analyst population, but more data are required than currently mandated—e.g., earnings release, MD&A, etc.

ii. If anything, users require more, not less, types of machine-readable data to be made available, because a significant amount of information they require are not from SEC filings or financial statements.

iii. The primary focus on data in the SEC filings of annual and quarterly financial statements seriously limits the perceived ongoing usefulness and relevance of the data.

Over and over, the report points out that the SEC’s current mandate for structured data is limited to the financial statements and accompanying notes (pp. 14, 18, 21, 24, 34-35, 42). Everything else that companies must file with the SEC under the U.S. securities laws is still submitted only in plain text. These other materials – earnings releases, corporate actions, executive compensation disclosures, proxy statements, officer and director lists, management discussions – could be valuable if tagged. But they are not. Investors “view access to the full array of footnote, management discussion and analysis (MD&A), and earnings release numerical data as the main reason to consider adapting their workflow to incorporate XBRL-tagged filings” (p. 21). But this demand is “pent-up” because such items are not – yet – included in the SEC’s mandate (p. 24).
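
Returning to the second fix above: Inline XBRL embeds the machine-readable tags inside the human-readable document, so a single filing serves both audiences. The sketch below is illustrative only – the ix: element and attribute names follow the Inline XBRL specification, but the document and figures are invented.

    # A minimal sketch of the Inline XBRL idea: one document that people
    # read as HTML and programs read as tagged data. Names follow the
    # Inline XBRL 1.1 spec; the filing content is invented.
    import xml.etree.ElementTree as ET

    INLINE = """<html xmlns="http://www.w3.org/1999/xhtml"
          xmlns:ix="http://www.xbrl.org/2013/inlineXBRL"
          xmlns:us-gaap="http://fasb.org/us-gaap/2013-01-31">
      <body>
        <p>Revenues for fiscal 2012 were
           $<ix:nonFraction name="us-gaap:Revenues" contextRef="FY2012"
                            unitRef="USD" scale="6" decimals="-6">1,500</ix:nonFraction>
           million.</p>
      </body>
    </html>"""

    IX = "{http://www.xbrl.org/2013/inlineXBRL}"

    root = ET.fromstring(INLINE)
    for fact in root.iter(IX + "nonFraction"):
        # The text a human reads ("1,500") doubles as the machine value
        # once the thousands separator and the scale attribute are applied.
        value = float(fact.text.replace(",", "")) * 10 ** int(fact.get("scale"))
        print(fact.get("name"), "=", value)

Because the tags wrap the very numbers readers see, the plain-text and structured versions cannot drift apart – which is the point.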

What lies ahead? 

The path forward for the SEC is clear. First, the agency must take the basic steps necessary to improve the quality of structured-data financial statements. Second, to tap the full potential of structured data, the agency must stop requiring the simultaneous submission of plain-text and structured-data versions of financial statements and instead collect a single structured-data version. That would encourage companies, analysts, and the SEC’s own staff to focus on data, not on documents – and data transparency requires full standardization as well as publication. Third, the agency must expand its structured-data mandate by phasing in more disclosures: earnings releases, management’s discussion and analysis, executive compensation, proxy disclosures, ownership structure, board and officer lists, insider trading reports – and, eventually, everything.

If the SEC is unwilling to act, Congress could insist. Our Coalition will call for the reintroduction, this year, of the Financial Industry Transparency Act. That bipartisan proposal, first introduced in 2010 by Reps. Darrell Issa (R-CA), Edolphus Towns (D-NY), and Spencer Bachus (R-AL), would require these steps as a matter of law.
