Are increasing data integration costs becoming an issue for your business? Any workarounds you would recommend?

29k views · 6 Upvotes · 18 Comments

Senior Director in Finance (non-banking), 10,001+ employees
Our challenge is more around the value accrued, and the integrity and velocity of the data, with a constantly evolving business model. From our perspective, the path forward is reusable, standardized integrations focused on the key areas of sales & finance, or whatever is important to run the company (customer, prices, discounts, commissions, revenue, margin, etc.). If we simplify and make it low-friction, then cost becomes less of a concern, relatively speaking.
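The "standardize on the key entities" idea above can be sketched in code: every source system maps into one canonical record shape, so the downstream sales/finance pipeline is written once and only the per-source mapper varies. All names and fields here are illustrative, not from any specific product.

```python
from dataclasses import dataclass
from typing import Callable, Iterable

@dataclass
class SalesRecord:
    """Canonical shape shared by every integration (hypothetical fields)."""
    customer: str
    price: float
    discount: float

    @property
    def revenue(self) -> float:
        # Revenue is derived the same way regardless of the source system.
        return self.price * (1 - self.discount)

def integrate(rows: Iterable[dict], mapper: Callable[[dict], SalesRecord]) -> list[SalesRecord]:
    """Reusable integration step: only the per-source mapper changes."""
    return [mapper(r) for r in rows]

# Hypothetical CRM export with its own field names, adapted by one lambda.
crm_rows = [{"acct": "Acme", "list_price": 100.0, "disc_pct": 0.1}]
records = integrate(crm_rows, lambda r: SalesRecord(r["acct"], r["list_price"], r["disc_pct"]))
```

Adding a second source (an ERP, a billing system) then means writing one more mapper, not one more pipeline.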
CTO in Software, 11 - 50 employees
I'm not a fan of "workarounds", so I'll go with a solution that I've used with great success in the past: SnapLogic's iPaaS. It allows one to continually evolve as data sources ebb and flow.
Director Of Information Technology in Services (non-Government), 10,001+ employees
Two years ago I came into an area where almost all integrations were made with database links. There were 100 applications, each with its own DB instance, all bound by a spiderweb of interdependency. To get through it, I started looking at various data virtualization technologies, which would allow our systems to share data without the strong coupling of the links.
Two standout technologies I found were SnapLogic and Denodo. Both worked as instant API creators without changing the underlying tech of the source and destination systems. Both have their merits. Feel free to contact me if you want some details.
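The pattern described here can be sketched generically. This is a minimal illustration of the data virtualization concept, not the actual API of SnapLogic or Denodo: a thin facade answers queries by delegating to the source systems, so consumers address a logical source name instead of taking a direct database link.

```python
import sqlite3

class VirtualLayer:
    """Toy facade over multiple physical databases (illustrative only)."""
    def __init__(self):
        self.sources = {}

    def register(self, name, conn):
        self.sources[name] = conn

    def query(self, name, sql, params=()):
        # Loose coupling: callers know a logical source name, not the
        # physical connection details of the underlying system.
        return self.sources[name].execute(sql, params).fetchall()

# One of the "100 applications", with its own database instance.
hr = sqlite3.connect(":memory:")
hr.execute("CREATE TABLE employees (id INTEGER, name TEXT)")
hr.execute("INSERT INTO employees VALUES (1, 'Ada')")

layer = VirtualLayer()
layer.register("hr", hr)
rows = layer.query("hr", "SELECT name FROM employees WHERE id = ?", (1,))
```

Replacing the HR system later only changes what is registered under `"hr"`; consumers of the layer are untouched.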
Senior VP, Global CTO Hybrid IT in Software, 10,001+ employees
There are a number of data integration platforms out there depending on your data sources, including platforms like Amazon Redshift. I would start with the end state of what you're looking to do with the data and work backwards. Several of the ISVs in the virtualization space are building data integration, including governance, into their toolsets. I also wouldn't be afraid to build some tools internally, if you have the capability: understanding the value of the end-state data, and the often-changing business requirements, means you may want to look around the corner a bit before making a commitment.
Chief Information Officer in Finance (non-banking), 1,001 - 5,000 employees
There are many integration-as-a-service platforms for both on-premise and cloud-based data sets. It really depends on the level of integration and abstraction required for a specific business outcome. It is also important to understand the data lineage and future purpose before deciding on the integration strategy.
Group CIO in Energy and Utilities, 1,001 - 5,000 employees
Why is this question asked by an anonymous user? Sorry... I don't talk to anonymous people.
Chief Information Officer in Manufacturing, 10,001+ employees
I never use workarounds, because in the end they will fail. Do it right the first time. I am also concerned about data integrity when using cloud service providers as opposed to on-prem platforms. If you decide to use a third party for data integration, be sure your contract is written to allow you to recover your data if the relationship is severed. Cost can be an issue, but the better question is: what is the cost to the business if it is not done correctly?
CEO, Self-employed
I think it mostly depends on the business type. If the business is a primary type that communicates directly with clients, then the cost is not high compared to the profit returned from utilising that data...
Division VP, IT, Self-employed
I would recommend a longer-term architectural approach like leveraging an integration framework. I have always put in place some flavor of ESB (enterprise service bus), such as Dell Boomi, MuleSoft, Apache Camel, etc., whenever I go into any company, because it strategically provides several things:
1) An architecture for integrations that will scale (these frameworks treat integrations like applications, so you can cluster/scale them like any other application type).
2) Any-to-any endpoint support.
3) Decreased cost of integrations (over time, you typically end up re-developing only the outbound part of an integration, for example when replacing one system with another).
4) Self-documentation (Boomi, for example, documents the integrations and data pipelines within the tool for no extra effort).
There are more advantages, but that's a good start. For decreasing the cost of data processing and storage, I would recommend looking at what storage types you use (most cloud vendors offer longer-term storage classes that are cheaper than real-time application disk/memory usage). Cursory, but the above will hopefully provide some other areas to consider.
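The route-based style these ESB frameworks share can be sketched in a few lines. This is an illustrative toy, not the Boomi/MuleSoft/Camel API: an integration is declared as a route from an inbound endpoint, through transforms, to one or more outbound endpoints, so swapping a system only means rewriting that system's endpoint adapter.

```python
class Route:
    """Toy ESB-style route: source -> transforms -> sinks (illustrative)."""
    def __init__(self, source, *transforms):
        self.source = source          # callable yielding inbound messages
        self.transforms = transforms  # applied in order to each message
        self.sinks = []               # any-to-any: attach multiple outputs

    def to(self, sink):
        self.sinks.append(sink)
        return self

    def run(self):
        for msg in self.source():
            for transform in self.transforms:
                msg = transform(msg)
            for sink in self.sinks:
                sink(msg)

# Hypothetical endpoints: an in-memory order feed and a collected output.
orders = [{"id": 1, "amount": "19.99"}]
out = []
route = Route(lambda: iter(orders), lambda m: {**m, "amount": float(m["amount"])})
route.to(out.append)
route.run()
```

The real products add clustering, monitoring, and hundreds of prebuilt endpoint adapters on top of this same shape, which is where point 1) and point 2) above come from.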
COO and Partner, Life Sciences in Software, 201 - 500 employees
If you are increasing costs as a result of acquiring valuable new data sources, providing raw material into data science etc. that drives business value, then I'm not sure the increasing costs alone are a reason for concern. ROI on those costs is a better way to look at it.

But tools like Denodo can eliminate the movement of data and provide analytics in place, building upon the "virtual" data warehouse concept. Performance trade-offs need to be evaluated when these architectures are implemented for self-service analytics.
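The "analytics in place" idea can be sketched as query federation: answer an analytic question by querying each source where it lives and joining only the small results, instead of copying full datasets into a central store. This is a conceptual sketch under invented table and column names, not Denodo's mechanism.

```python
import sqlite3

# Two independent source systems, each queried in place.
sales = sqlite3.connect(":memory:")
sales.execute("CREATE TABLE orders (cust_id INTEGER, amount REAL)")
sales.executemany("INSERT INTO orders VALUES (?, ?)",
                  [(1, 50.0), (1, 25.0), (2, 10.0)])

crm = sqlite3.connect(":memory:")
crm.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
crm.executemany("INSERT INTO customers VALUES (?, ?)",
                [(1, "Acme"), (2, "Globex")])

# Only small aggregates cross the system boundary, not the raw rows.
totals = dict(sales.execute("SELECT cust_id, SUM(amount) FROM orders GROUP BY cust_id"))
names = dict(crm.execute("SELECT id, name FROM customers"))
report = {names[c]: t for c, t in totals.items()}
```

The performance trade-off mentioned above shows up exactly here: the join happens at query time against live sources, which saves the copy but makes each analytic query only as fast as the slowest system it touches.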
