Tuesday, May 14, 2019

DBSync Replication tool: Key functions


A brief explanation of its functionalities and uses
Replication is about identifying changes and updating other data sources. Its applications are many, from data warehousing to business intelligence and compliance.
Our replication tool has many useful functionalities that can help you achieve those goals based on your business strategies. Thus, a good understanding of their scope and differences is vital to obtaining the maximum benefit from the tool.
In this article, you will learn why replication is essential by analyzing several possible use scenarios. Following this, you will understand the main features of the software and their differences, and the different ways you can run the tool.
So, let’s start by understanding where replication has a place in your business.
Why is replication important?
Replication is an integral part of your daily business tasks because, by identifying changes, it keeps your data up to date in all data sets. This can easily be seen in several potential use cases of our DBSync Replication tool. These possible uses are:
  • Stage a database for data warehousing: our replication tool can be used to create the schema for a data warehouse. Once the warehouse structure has been defined, our tool can also be used to populate it.
  • Create datasets for business intelligence and reporting: business intelligence and reporting need specialized data. Our tool helps you create tailored datasets, which can be used as data feeds for specialized applications and reports.
  • Create backups and archives: backups and archives are part of the life of the DBA. Our tool helps your business by automating many of these functions, freeing the DBA's time for more valuable tasks. Archives are also an essential part of regulatory compliance, and automating data archiving according to your business characteristics is one of the best ways to ensure compliance.
  • Integrate Salesforce with in-house applications: Salesforce is a powerful application; working in combination with other apps, it is even more powerful. The DBSync Replication tool can create the data sets that feed those apps. Moreover, once the data has been copied into a database, it can be combined with data from other sources, massaged, and then incorporated back into Salesforce.
  • Create a data set for in-house application development: app development needs data for many of its phases, such as testing. By using our tool, your developers can create tailored and updated data sets, with the added advantage that your production data remains untouched.
Having seen some examples that explain why replication is a must in your business, let’s see the main characteristics of the DBSync Replication tool, which define its applicability in your industry.
What are the main characteristics of the DBSync Replication tool?
The DBSync Replication tool has several outstanding technical aspects that define its scope and applicability. The most important ones are:
  • Auto-creation of new fields: when a new field is added to the Salesforce schema, it is automatically incorporated into the database structure. This ensures that the structure of your backups and other data sets remains up to date and, being automated, removes the task from the DBA's to-do list.
  • Downloading of objects/tables into a database: our tool allows you to download Salesforce objects/tables into a database of your choice.
  • Uploading of data from a local database to Salesforce: you can upload data from a local database to Salesforce.
  • Incremental updates: our tool automatically tracks data changes, so you never miss one, reducing errors and freeing up your DBA's time for more important tasks.
  • Migration of non-replicable objects: non-replicable objects are entities whose changes cannot be tracked over time; Salesforce examples are LeadShare, AccountShare, and OpportunityShare. The DBSync Replication tool migrates them by copying the complete objects. This option is available in the Database Details section of the program.
  • Real-time integration using outbound messages: by using Salesforce's outbound messaging capability, our integration tool gives you practically real-time integration (a minimal listener sketch follows this list).
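To make the outbound-message idea concrete, here is a minimal sketch of a listener written in Python with Flask. This is an illustration, not DBSync's implementation: the endpoint path is hypothetical, and a production listener would validate the sender and fully parse the SOAP payload before touching the database.

```python
# Minimal sketch of a Salesforce outbound-message listener (illustrative only).
# Assumes Flask is installed; the /outbound endpoint path is hypothetical.
from flask import Flask, request, Response

app = Flask(__name__)

# Salesforce expects a SOAP acknowledgment so it stops retrying the message.
ACK = """<?xml version="1.0" encoding="UTF-8"?>
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">
  <soapenv:Body>
    <notificationsResponse xmlns="http://soap.sforce.com/2005/09/outbound">
      <Ack>true</Ack>
    </notificationsResponse>
  </soapenv:Body>
</soapenv:Envelope>"""

@app.route("/outbound", methods=["POST"])
def outbound():
    soap_payload = request.data.decode("utf-8")  # raw SOAP XML from Salesforce
    # A real listener would parse the XML and apply the change to the target database.
    print("Received outbound message:", soap_payload[:200])
    return Response(ACK, mimetype="text/xml")

if __name__ == "__main__":
    app.run(port=8080)
```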
Having understood the main technical aspects of our tool, let's look at the main ways you can use it.
How can I use the DBSync Replication tool?
Using the DBSync replication tool is straightforward. The system guides you step by step. First, you need to define your source and destination information. For example, you will have to input the credentials for your database and Salesforce.
Once your source and destination information are complete, you can run the replication tool. There are six run options available, which are:
  • Clean copy: a Clean Copy is a copy of all the data available in a Salesforce instance. In other words, it copies all existing records in Salesforce to a database of your choice. The clean copy can be scheduled via the Scheduler, and thus, used as a backup tool.
  • Update schema: updates only the schema, or structure, of a database. It copies the tables and their columns, with their respective names, from the source to the destination database; no data is copied. Update schema comes in handy when we want to create a new instance of a database.
  • Source to DB (database): provides an incremental backup. In other words, it copies the records that exist in the source (for example, Salesforce) but don't exist in the database, and updates the database with those records that have been changed in Salesforce. This option can be used as a complement to the Clean Copy option in a backup strategy: for example, a clean copy scheduled every Friday night, plus incremental backups scheduled daily (see the sketch after this list).
  • DB (database) to Source: this option is similar to the previous one, but with a reversed flow. In other words, it creates in the source (for example, Salesforce) the records that exist in the database but not in the source, and updates those that have been changed in the database but not in the source.
  • Snapshot: a snapshot is a copy of the whole Salesforce database, which can be restored. Its name includes a time reference, which makes snapshots very useful for legal compliance or as proof in case of litigation, as each one is a copy of the data contained in Salesforce at a point in time.
  • Export: allows the user to connect two different Salesforce instances, in other words, to transfer data from one Salesforce instance to another. This option is useful when you need to create a new Salesforce instance based on an existing one.
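As a hedged sketch of the weekly-plus-daily backup strategy mentioned under "Source to DB", the snippet below expresses the schedule in Python with the third-party `schedule` package. The `run_replication` helper and its mode names are hypothetical stand-ins for however you actually trigger DBSync runs (the built-in Scheduler, cron, etc.).

```python
# Hedged sketch of a backup schedule: weekly clean copy + daily incremental runs.
# `run_replication` is a hypothetical stand-in for triggering a DBSync run;
# in practice you would use DBSync's built-in Scheduler or a cron job.
import time
import schedule  # third-party package: pip install schedule

def run_replication(mode: str) -> None:
    print(f"Triggering DBSync replication run: {mode}")
    # ... call out to the replication tool here ...

# Full backup every Friday night.
schedule.every().friday.at("23:00").do(run_replication, mode="clean_copy")
# Incremental backup every night.
schedule.every().day.at("01:00").do(run_replication, mode="source_to_db")

while True:
    schedule.run_pending()
    time.sleep(60)
```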
Summarizing
In this article, you have seen why replication is vital to your business and the main characteristics of our DBSync Replication tool. With the growth of data that characterizes our present times and the need for up-to-date information, the use of replication continues to grow, as does the number of applications of our tool.
Ready to learn more? Go to our page and try our software, or contact one of our representatives. We are there to serve you and make your business grow.

Monday, May 13, 2019

Force of a Godzilla – Salesforce and JIRA


The name “JIRA” is inherited from the Japanese word “Gojira,” which means “Godzilla.” When you combine it with Salesforce, you get “the force of a Godzilla”!
Ideal CRM and Helpdesk Setup
We are all aware that Salesforce is the world's #1 CRM, used across industries for managing customer data, and widely used by departments such as marketing, sales, project management, accounting, operations, support, and services. It instantly streamlines and automates your business processes, bringing incredible simplicity, efficiency, visibility, and integration, and helps you achieve higher customer satisfaction without any infrastructure to buy, set up, or manage. That is the beauty of this cloud-based CRM application.
As mentioned above, Salesforce can manage the entire sales cycle, right from account creation by injecting leads into the system, to creating relevant opportunities, to timely reminders for follow-up, and eventually closing the deal. With the ability to stage each of these milestones as prospect, proposal, invoicing, won, or lost, it can easily be customized to fit your sales process. To yield maximum benefits and handle data more efficiently, Salesforce can integrate with other systems like databases, ERP, accounting, warehouse management, chat, LMS, helpdesk, and more, which makes it ideal for any organization automating its IT ecosystem, saving vast numbers of resource hours and significant investment.
Similarly, JIRA, a well-known system used mainly for issue tracking, bug tracking, and agile project management, offers integration options with other major systems, allowing organizations to share and manage data efficiently.
In a typical customer service environment, it is vital to have a ticketing system in place to log, track, and resolve client issues. It enables organizations to understand their clients better, know the problem areas, and take the product or services in the right direction. JIRA Service Desk proves to be an excellent solution: incoming client requests are accepted and converted into cases, which are then assigned to a resource to work on.
The biggest reason companies opt for JIRA with Salesforce is that the two work well hand in hand: Salesforce is used for client management, and JIRA for case management and project tracking. This is a powerful combination for any organization that needs a straightforward, well-organized issue-tracking process working alongside a CRM that holds all client records. We can say JIRA was aptly named after “Gojira,” which means Godzilla, and which, when combined with Salesforce, becomes “the force of a Godzilla.”
Now that’s a POWERHOUSE! 
Well, after reading this, a fundamental question might have popped into our minds –
How do we know that our organization needs integration between Salesforce and JIRA? What are the factors to consider even to conclude that we should have an integration in place?
Here are some indicators:
  1. You have a resource dedicated to ticket creation, resource alignment, prioritization, and SLA follow-up, and you depend on them to initiate the process
  2. You have people entering the same data manually in Salesforce and JIRA
  3. Your resources are spending more time entering data than doing the actual job
  4. You have duplicate records or bad data
  5. Your SLAs are not met for issue resolution
  6. Low resource efficiency
  7. You witness data mismatches in project, issue, or bug tracking, creating confusion between teams and resources
  8. You are losing clients due to delayed, incorrect or unresolved issues logged
  9. Inaccurate data analysis and forecast due to incorrect or varying data  
JIRA also integrates well with other Atlassian products, like Confluence, which makes self-service and knowledge-base articles seamless.
Now that we know we need integration in place between the two systems, what should we look for in integration to best serve our organization?
  1. Connector with prebuilt templates
  2. Correct and seamless field mapping
  3. Fully compatible with your Salesforce and JIRA versions
  4. Auto-sync functionality with intervals as low as 3 minutes, or even real time
  5. Ability to make the integration bi-directional
  6. Provides error logs and notifications
  7. Ability to connect cloud to cloud or cloud to on-premise applications
  8. A simple interface and ability to add new connectors in no time
  9. Cost effective, i.e. higher ROI
Accurate field mapping is one of the most critical functionalities an integration tool should offer you. Let's look at the standard field mappings provided by DBSync, which are preferred by most organizations using Salesforce and JIRA.
Salesforce and JIRA Service Desk – JIRA Service Desk Issues to Salesforce Cases
The JIRA Service Desk & Salesforce unidirectional standard template helps you integrate data between JIRA Service Desk and Salesforce. It enables you to streamline your data, automate business processes, and ensure data accuracy, eliminating redundant data entry.
With DBSync, you can also construct custom mappings for this template; DBSync fully supports the creation of custom mappings to suit your business model and processes.
This template supports almost all major Salesforce editions: Unlimited, Professional, Enterprise, and the Non-Profit Edition.
It supports JIRA Service Desk and JIRA Core, in both Cloud and On-premise versions.
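As a purely hypothetical illustration of what an issue-to-case field mapping covers (the field pairs below are common defaults, not DBSync's actual template, and `map_issue_to_case` is an invented helper), such a mapping could be sketched like this:

```python
# Hypothetical sketch of a JIRA-issue -> Salesforce-Case field mapping.
# Field names are common defaults, not DBSync's actual template.
FIELD_MAPPING = {
    "summary": "Subject",          # JIRA issue summary -> Case subject
    "description": "Description",  # long text -> long text
    "status": "Status",            # may need value translation (e.g. Done -> Closed)
    "priority": "Priority",
    "reporter": "SuppliedName",    # who raised the request
}

def map_issue_to_case(issue_fields: dict) -> dict:
    """Translate a JIRA issue's fields into a Salesforce Case payload."""
    return {sf_field: issue_fields.get(jira_field)
            for jira_field, sf_field in FIELD_MAPPING.items()}

# Example:
case = map_issue_to_case({"summary": "Login fails", "priority": "High"})
print(case)  # {'Subject': 'Login fails', ..., 'Priority': 'High'}
```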
Salesforce and JIRA Core – JIRA Core Issues to Salesforce Cases
This template makes it easy to integrate JIRA Core projects and Salesforce Cases. It provides bidirectional integration capabilities with pre-built field-to-field mappings, along with the flexibility for more complex and dynamic mapping options.
This template supports JIRA Software 7.3.1 and above.
Conclusion
Integrating the two systems, Salesforce and JIRA, offers complete transparency in communication and supports cross-team work, making it more effective and productive. With the help of the DBSync integration platform, it becomes easier for organizations to manage and track issues in both systems with high accuracy, leading to higher customer satisfaction and organizational improvement.
DBSync's standard solution and dynamic platform can help you integrate Salesforce and JIRA Service Desk in near real time. For specific needs, it can offer a tailor-made solution by leveraging the DBSync iPaaS and its set of JIRA connectors. Please reach out to sales@mydbsync.com or call 1-877-739-2818 for more information. Alternatively, reach out to your Account Manager if you are already a DBSync user.

Friday, May 10, 2019

5 reasons why customers should have Data Integration


1. Take advantage of specialized applications
Every piece of software that works with data represents, analyzes, and transforms information in a specialized way. By integrating data into a format accepted by that application, you give yourself the power to open and use your data in that software.
For example, DBSync offers different types of connectors, such as Salesforce, Microsoft Dynamics CRM, NetSuite (ERP), QuickBooks, and many more.
With the help of these connectors, you can integrate data from different sources into different destination platforms.
2. Reduce technical complexity and improve business agility
A modern data integration architecture enables organizations to eliminate point-to-point integration, reduce complexity, and improve overall optimization, while providing self-service access for distributed teams.
3. Make Data More Available
Centralizing your data makes it easy for anyone at your company (or outside of your company, depending on your goals) to retrieve, inspect, and analyze it.
Easily accessible data means easily transformed data. People will be more likely to integrate the data into their projects, share the results, and keep the data up to date.
4. Managing CRM Applications
Customer data integration is essential for the success of CRM. You need to have accurate customer details (the single version of the truth). This is important so that all the customer service employees (from those who call the customer to those who process the application) have a clear view of the customer and see the same data. Organizations are mainly utilizing customer data integration for customer retention by having a 360-degree view of them. Their next step would be using customer data integration to get new customers.
5. Data Integrity
Many things happen to your data from the day it is created in your system and throughout its lifecycle. It can be transferred to other systems, and altered and updated multiple times. Human interactions, data transfers, viruses, or compromised hardware are all events that can compromise your data's integrity. This is why maintaining data integrity can become a tricky task.
In my job providing data integration solutions to organizations, data quality and data integrity are always topics we discuss in detail. Data integrity should be top of mind at every stage of the data lifecycle, starting from the design and implementation phase of your systems. Data integrity also ensures recoverability, searchability, traceability, and connectivity. Protecting the validity and accuracy of data also increases the stability and performance of your systems.
Reasons to go for DBSync
DBSync is a proven integration platform that facilitates integration between various legacy systems, services, processes, business partners, and data, providing new business value and improving business performance. DBSync manages all communication, data translation, and process interaction among the connected applications. As for pricing, we offer excellent plans, and clients can choose the package that best fits their needs and requirements. Please check the prices of all subscription plans at DBSync.
Best-in-class support and service plans
We provide everything from videos that help customers with implementation, to training documentation, to a community platform, to one-on-one support from partners who can speed up your integration.
DBSync ensures you have everything required to implement and run your integration project deployment successfully.

Standard support through email or online channels is included as part of the yearly subscription.
Check out more.
Easily Customizable Integration using iPaaS
The DBSync iPaaS is the fastest way to integrate two applications, thanks to its pre-defined solutions. It allows companies to quickly connect “Cloud to Cloud” or “Cloud to On-Premise” applications to solve any integration problem. You can also use it to streamline well-known business processes like order-to-cash, procure-to-pay, service-to-cash, and payment integration, or to integrate your cloud applications with your database.
DBSync also provides a Development Studio that jump-starts your cloud adoption with pre-built templates, easily customizable integrations, and APIs that extend the platform, so that even non-technical users can quickly build processes connecting an unlimited number of endpoints. Check out more.
Support for Standard and Custom Connectors for Integration
Connectors, or adapters, are a set of components within the iPaaS that open a connection to an external system; they are used to authenticate with, push data to, and pull data from that system.
Connectors in DBSync may be categorized into two types:
  • Standard Connectors
The standard connector presents a published, constant interface (API) to users and other components on the upstream side of the connector. Please check the link below to see which standard connectors we support.
  • Custom Connectors
DBSync provides an SDK for building custom connectors. We often get requests from customers to create custom connectors to transfer or integrate data from one application to another, such as Salesforce to QuickBooks. Check out more.
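To give a feel for what a connector does, here is a minimal sketch of the kind of interface a custom connector typically implements: authenticate, pull, and push. This is an illustrative Python outline under assumed names, not DBSync's actual SDK.

```python
# Illustrative outline of a connector's responsibilities (not DBSync's SDK).
from abc import ABC, abstractmethod

class Connector(ABC):
    """A connector authenticates against an external system and moves data."""

    @abstractmethod
    def authenticate(self, credentials: dict) -> None:
        """Open and verify a connection to the external system."""

    @abstractmethod
    def pull(self, object_name: str) -> list[dict]:
        """Read records of the given object/table from the external system."""

    @abstractmethod
    def push(self, object_name: str, records: list[dict]) -> None:
        """Write records of the given object/table to the external system."""
```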
Tutorials
A tutorial is a method of transferring knowledge as part of a learning process, and DBSync's tutorials are used for exactly that.
Customers often use different tools for data integration without knowing how to use them effectively. DBSync therefore provides a Tutorials section where users can see how to integrate their data from source to destination applications, such as Salesforce to QuickBooks, with the help of DBSync. The tutorials cover the query builder and the hierarchy mapping for the data flow.
Check out more.
Template Library
The Template Library is a collection of pre-built integration solutions. These solutions can be easily added to your integration instance.
It helps customers understand how to integrate from source to target applications. With the help of the Template Library, customers learn about the data flow, field mappings, process map, and prerequisites for an integration, plus the sequential steps for getting started with DBSync. Check out more.

Wednesday, February 20, 2019

Describing Enterprise Artificial Intelligence: Platform as a Service With Current AI Infrastructure



When talking about enterprise AI, it is hard to think of an application that doesn't use a database. From mobile to web to desktop, every modern application relies on some form of database. Some apps use flat files, while others rely on in-memory or NoSQL databases.

Traditional enterprise applications interact with large database clusters running Microsoft SQL Server, Oracle, and so on. The fact is that every application needs a database.

Like databases, Artificial Intelligence (AI) is moving towards becoming a core component of modern applications. In the coming months, almost every application that we use will depend on some form of AI.

Enterprise AI simulates the cognitive functions of the human mind — learning, reasoning, perception, planning, problem solving and self-correction — with computer systems. Enterprise AI is part of a range of business applications, such as expert systems, speech recognition and image recognition.

Figure 1


1. Start Consuming Artificial Intelligence APIs

This approach is the least disruptive way to get started with AI. Many existing applications can turn intelligent through integration with language understanding, image pattern recognition, text-to-speech, speech-to-text, natural language processing, and video search APIs.

Let's look at a concrete example: analyzing customer sentiment during a customer product-requirement demo session. Almost all customer calls to the service team are recorded for random sampling.

A supervisor routinely listens to the calls to assess the quality and the overall satisfaction level of customers. But this analysis covers only a small subset of all the calls the service team receives from customers. This use case is an excellent candidate for AI APIs: each recorded call can first be converted into text, which is then sent to a sentiment analysis API, which ultimately returns a score that directly represents the customer satisfaction level.

The best thing is that the process takes only a few seconds per call, which means the supervisor now has visibility into the quality of all calls in near real time. This approach enables the company to quickly escalate incidents involving unhappy customers or rude customer service agents. From CRM to finance to manufacturing, customers across domains will benefit tremendously from the integration of AI, and there are multiple AI platforms and API providers to choose from.
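A hedged sketch of that pipeline in Python: the two REST endpoints below (`SPEECH_TO_TEXT_URL`, `SENTIMENT_URL`) are hypothetical placeholders for whichever provider's APIs you choose, and the response fields are assumptions.

```python
# Hedged sketch of the call-analysis pipeline: audio -> text -> sentiment score.
# The endpoint URLs and response fields are hypothetical placeholders for a
# real provider's speech-to-text and sentiment-analysis APIs.
import requests

SPEECH_TO_TEXT_URL = "https://api.example.com/v1/speech-to-text"  # placeholder
SENTIMENT_URL = "https://api.example.com/v1/sentiment"            # placeholder
API_KEY = "YOUR_API_KEY"

def analyze_call(audio_path: str) -> float:
    """Return a satisfaction score for one recorded call."""
    with open(audio_path, "rb") as audio:
        transcript = requests.post(
            SPEECH_TO_TEXT_URL,
            headers={"Authorization": f"Bearer {API_KEY}"},
            files={"audio": audio},
        ).json()["text"]                      # assumed response field

    sentiment = requests.post(
        SENTIMENT_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"text": transcript},
    ).json()
    return sentiment["score"]                 # assumed: -1.0 (angry) .. 1.0 (happy)

print(analyze_call("call_0423.wav"))
```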


2. Build and Deploy custom AI models in the Cloud

While consuming APIs is a great start for AI, it is often limiting for enterprises.

Once customers have seen the benefits of integrating artificial intelligence with their applications, they will be ready to take it to the next level.

This step includes acquiring data from a variety of existing sources and implementing a custom machine learning model. It requires creating data processing pipelines, identifying the right algorithms, training and testing machine learning models and finally deploying them in production.

Similar to Platform as a Service, which takes your code and scales it in a production environment, Machine Learning as a Service offerings take your data and expose the final model as an API endpoint. The benefit of this deployment pattern lies in using cloud infrastructure for training and testing the models: customers can spin up infrastructure powered by advanced hardware configurations based on GPUs and FPGAs.
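As a rough sketch of this pattern, the snippet below trains a tiny scikit-learn model and exposes it as an HTTP endpoint with Flask; an MLaaS platform automates exactly this packaging and scaling for you. The data and feature names here are made up.

```python
# Sketch of the MLaaS pattern: train a model, expose it as an API endpoint.
# An MLaaS platform does this packaging/scaling for you; data here is made up.
from flask import Flask, request, jsonify
from sklearn.linear_model import LogisticRegression

# Toy training data: [calls_per_week, tickets_open] -> churn (0/1).
X = [[1, 0], [2, 1], [8, 5], [9, 7]]
y = [0, 0, 1, 1]
model = LogisticRegression().fit(X, y)

app = Flask(__name__)

@app.route("/predict", methods=["POST"])
def predict():
    features = request.get_json()["features"]   # e.g. {"features": [3, 2]}
    return jsonify({"churn": int(model.predict([features])[0])})

if __name__ == "__main__":
    app.run(port=8080)
```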

Several platforms offer Machine Learning as a Service.

3. Run Open Source AI Platforms On-Premises

The final step in AI-enabling applications is to invest in the infrastructure and teams required to generate and run the models locally. This is for enterprise applications with a high degree of customization and for those customers who need to comply with policies related to data confidentiality.

If ML as a Service (MLaaS) is similar to PaaS, then running AI infrastructure locally is comparable to a private cloud. Customers need to invest in modern hardware based on SSDs and GPUs designed for parallel processing of data. They also need expert data scientists who can build highly customized models based on open-source frameworks. The biggest advantage of this approach is that everything runs in-house: from data acquisition to real-time analytics, the entire pipeline stays close to the applications. The flip side is the OPEX and the need for experienced data scientists.

Customers implementing their own AI infrastructure typically use one of the open-source platforms available for machine learning and deep learning.

If you want to get started with AI, explore the APIs first before moving to the next step. For developers, the hosted MLaaS offerings may be a good start. Artificial intelligence is evolving to become a core building block of contemporary applications, and AI is set to become as common as databases. It's time for organizations to create a roadmap for building intelligent applications.

AI Data Evolutions: Data Processing and Neural Networks

Today, we feed computers large amounts of data so that they can learn through deep learning technologies, and this capability is a major reason behind taking an AI initiative.

Neural networks process information in a similar way the human brain does. The network is composed of a large number of highly interconnected processing elements (neurons) working in parallel to solve a specific problem. Neural networks learn by example to solve complex signal processing and pattern recognition problems, including speech-to-text transcription, handwriting recognition and facial recognition.
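A single artificial neuron illustrates the "interconnected processing element" idea in a few lines; this is a toy sketch with made-up weights, not a full network.

```python
# Toy sketch of a single artificial neuron: weighted sum + activation.
import math

def neuron(inputs, weights, bias):
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-total))   # sigmoid activation squashes to (0, 1)

# Made-up weights; a real network learns these from example data.
print(neuron([0.5, 0.9], weights=[0.8, -0.2], bias=0.1))
```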

Data processing is, generally, “the collection and manipulation of items of data to produce meaningful information.” In this sense it can be considered a subset of information processing, “the change (processing) of information in any manner detectable by an observer.”

At the core of AI data processing is the need for high-quality data. While data quality has always been important, it's arguably more vital than ever with AI initiatives.

In 2017, research firm Gartner described a few of the myths related to AI. One is that AI is a single entity that companies can buy. In reality, it’s a collection of technologies used in applications, systems and solutions to add specific functional capabilities, and it requires a robust AI infrastructure.

Another myth is that every organization needs an AI strategy or a chief AI officer. The fact is, although enterprise AI technologies will become pervasive and increase in capabilities in the near future, companies should focus on business results that these technologies can enhance.

Conclusion

In conclusion, artificial intelligence applications are for everyone: as we have found, many existing applications can turn intelligent through integration with language understanding, image pattern recognition, text-to-speech, speech-to-text, NLP, and video search APIs.

As we saw earlier, a supervisor can only randomly sample recorded customer calls. With AI APIs, each call can be converted into text and sent for sentiment analysis, returning a score that directly reflects the customer's satisfaction level.

So, with the help of AI APIs, we can easily understand customer problems and give customers a better product.

As mentioned above, the next step is to build and deploy custom AI models in the cloud, which is even more useful and beneficial for customers.

This step takes customers to the next level: acquiring data from a variety of sources, creating data processing pipelines, identifying the right algorithms, and deploying the resulting models in a production environment.

Finally, use AI APIs to reduce customer pain, reduce complexity, and make your processes more effective.

Monday, February 18, 2019

OData and REST APIs - A Comparison



Are you curious about what REST and OData are? Then this article is for you. In it, you will find an explanation of what REST is and how it applies to the construction of RESTful APIs; what OData is and how it is used to build web services; and the differences between the two.

Brief comparison

The table below summarizes the main similarities and differences between REST APIs and OData Services.



Table 1: REST and OData comparison

As the table shows, both technologies follow REST principles, although OData is more relaxed: if there is a good reason to deviate from a principle, it lets you do so.

Thus the OData Protocol is more comprehensive than the REST model, as in addition to the use of the REST principles, it provides a way to describe both the data and the data model.

So, let’s go deep into the details by explaining each of the concepts mentioned in the above table.

What is REST

Representational State Transfer, better known as REST, is a software architecture style that defines a set of principles for creating efficient web services.

These principles were described by Roy Fielding in his doctoral dissertation written in 2000. As he stated during an interview:

“…That process honed my model down to a core set of principles, properties, and constraints that are now called REST”

In his dissertation Fielding identified six aspects of HTTP and HTML that made these technologies successful and efficient. These concepts or principles are:

  1. Client–server
  2. Stateless
  3. Cacheable
  4. Layered system
  5. Code on demand (optional)
  6. Uniform interface
    • Identification of resources
    • Manipulation of resources through representations
    • Self-descriptive messages
    • Hypermedia as the engine of application state (HATEOAS)

The HTTP protocol is an example of a system that implements the principles of REST.

What is a REST API

A REST API is simply an Application Programming Interface that follows the principles mentioned above. REST APIs usually use XML or JSON to communicate, although the REST architecture doesn't mandate anything specific in this regard. In fact, REST doesn't demand any particular format, and accepts any format that can be selected via content negotiation.
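For example, a client can ask the same resource for different representations purely through the HTTP Accept header; the URL below is a placeholder for any REST resource.

```python
# Content negotiation sketch: same resource, different representations.
# The URL is a placeholder for any REST API resource.
import requests

url = "https://api.example.com/customers/42"

as_json = requests.get(url, headers={"Accept": "application/json"})
as_xml = requests.get(url, headers={"Accept": "application/xml"})

print(as_json.headers.get("Content-Type"))  # e.g. application/json
print(as_xml.headers.get("Content-Type"))   # e.g. application/xml
```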



Figure 1: REST API

What is OData

OData or Open Data protocol is an application-level protocol that describes a way for interacting with data using RESTful services. At present it is in version 4.0. This latest version was standardized by OASIS and approved as an ISO/IEC International Standard.

Initially created by Microsoft in 2007, it was later released under the Microsoft Open Specification Promise, and thus made available to all.

The protocol is based on two main components: 
  1. Industry standards: HTTP, XML, Atom and JSON.
  2. REST-based architecture: HTTP protocol.

However, the specification states that:

  1. OData must “follow REST principles unless there is a good and specific reason not to”.
  2. OData Services MUST support the ATOM encoding.
  3. OData services SHOULD support a JSON encoding.

Figure 2 shows the structure of an OData-compliant web service. It rests on HTTP technology, which satisfies the requirement of being REST-based. In addition, it must include Atom encoding, which is based on XML technology, and must also adhere to OData's data-handling specifications. Since version 4.0, OData also handles JSON.



Figure 2: an OData compliant construct

OData uses Atom or JSON to define the data schema, and it provides two important models for the management of data. They are: 
  1. Entity Data Model or EDM: it is an abstract data model that MUST be used to describe the data exposed by the service.
  2. Service Model: it consists of two static resources and a set of dynamic resources. The static resources provide a way to ask a service about its data model, and the dynamic ones offer methods to manage the data model.

The two static resources are: 
  1. The metadata document, which describes the data model.
  2. The service document, which lists all of the top-level entity sets exposed by the service.
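These two static resources can be seen in action against the public OData reference service (TripPin); the requests below should work as-is, assuming the public service is reachable.

```python
# Fetching OData's two static resources from the public TripPin reference service.
import requests

root = "https://services.odata.org/V4/TripPinServiceRW"

# 1. The service document: lists the top-level entity sets.
service_doc = requests.get(root + "/", headers={"Accept": "application/json"})
print([e["name"] for e in service_doc.json()["value"]])  # e.g. ['People', 'Airports', ...]

# 2. The metadata document: describes the data model (CSDL XML).
metadata = requests.get(root + "/$metadata")
print(metadata.text[:100])
```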

An important feature of the OData protocol is its support for a versioning scheme that enables services to be backward compatible.

In addition, there are several libraries that can be used to facilitate building and consuming OData-based services. Among them, Restier is the main library for the .NET framework, and Apache Olingo is the most important for the Java platform.

Conclusion

In this article we have seen what REST means, how it is used in RESTful APIs and in the OData Protocol. In addition, we have described the OData protocol’s main components.

We have concluded that REST is part of OData, and also more general. Thus, OData can be considered a safer guide to creating RESTful services.

Read more related articles in our blog series.

Friday, February 15, 2019

How Integration Platforms are Driving the 4th Industrial Revolution in 2019



The first Industrial Revolution paved the way for a new energy source – steam – which pushed industries towards mechanization and helped them improve productivity.

The Second Industrial revolution introduced electricity and other innovations in infrastructure. It created new industries and opportunities to thrive on mass production lines.

When we arrived at the third revolution, during the late twentieth century, information technology was the major change agent in the automation of production processes.

And here we are now at the next big thing in the history of industrial revolutions – the fourth one. It is getting much attention and is hailed as a milestone as important as the first. Let's have a look at why that is.

The Fourth Industrial Revolution (4IR) is all about connectedness and how we can embed technology into our societies and industries. All emerging technology breakthroughs will speed up the digitization of industries, with cloud technologies and 5G as the front runners. Robotics, artificial intelligence, and the Internet of Things (IoT) are a few examples of emerging fields where such innovation is happening.

4IR, or Industry 4.0, is not about inventing technology for technology's sake. The diagram below shows how 4IR enhances connectedness and productivity, along with some of the social issues it raises.


4th Industrial Revolution
  
How integration platforms will drive the revolution

This is the age of disruptive technology. It is creating a level playing field for businesses with innovative and agile business models. Whether for an enterprise or a small business, constant innovation and integration of business processes have become essential.

Platforms with No-code/Low-code capabilities

With no-code and low-code development platforms, businesses can develop a software application without the need to write code. These platforms use a visual modeling approach with pre-built functional components available in the platform library. Developers simply select components and add them to the workflow in line with the application, achieving results with greater efficiency.

This approach also makes progress visible to people at all levels of the business. It also saves businesses a lot of money, as they do not need to stand up and maintain environments and infrastructure; the platforms offer pretty much everything in the package. Let's take a look at how no-code and low-code development capabilities work:


No-code and Low-code


No-code development capabilities

No-code development capabilities enable business users to solve basic functional problems between departments with simple solutions. These platforms are most suitable for SMBs, which can solve their integration challenges without hiring a professional developer, as there is no need to write code to implement such applications.

Low-code development capabilities

Low-code capabilities cater to enterprises that have technology governance requirements. They use a more structured approach, with scalable architectures and flexibility in terms of on-premises or cloud implementation. Low-code platforms also extend their capabilities through open APIs, supporting complex integrations with comprehensive components in the platform library, and can incorporate innovative next-generation technologies and third-party services available via open source.

These platforms can create sophisticated applications for enterprises that encompass various departments and domains. They offer developers better control over quality testing and performance tools, which results in high productivity and speedy deployment.

API-based ecosystem

With all the choices and capabilities integration platforms offer, there is no denying the power of the APIs that make these integrations possible.


API-Integration

As businesses grow bigger, requirements also grow wider. Enterprises need applications to work across departments and domains. They need to incorporate next-gen technologies to increase productivity. Then there is regulatory compliance and technology governance.

Also, there are challenges of modernizing the legacy systems to co-exist with Cloud infrastructure.

APIs act as an intermediary between on-premises and cloud systems, so all the other applications remain neutral to the platform they are installed on and keep access to data and other services as they move to the cloud.

Integration platforms and APIs work hand in hand to help businesses integrate applications and manage data with faster deployment and higher efficiency. This saves organizations time and money and, needless to say, the errors that come from complex coding. These platforms can drive the revolution by multiplying productivity many times over.

Wednesday, February 13, 2019

Handling EDI using Cloud Workflow



Electronic Data Interchange, or EDI, has existed since the early 70s, and many IT giants, such as IBM, still use it today. One of the many reasons is its practicality: this data format offers an easy and fast way to transfer commercial documents.

In this article, you will learn about the benefits of EDI, and how to create a collective digital model using EDI files and DBSync’s Cloud Workflow.


Benefits of EDI

EDI has been in the market for a long time, and one of the reasons is the benefits it delivers.

“EDI continues to prove its major business value by lowering costs, improving speed, accuracy and business efficiency. The greatest EDI benefits often come at the strategic business level.”

EDI Basics

EDI’s most important benefits can be summarized as:



Table 1: Benefits of EDI

These benefits can be capitalized in many business cases. One of the most important of them is Supply Chain Integration.


Use Case: Supply Chain Integration

Today businesses operate in a multi-enterprise environment. An important aspect of this ecosystem is the value added through the integration of partners and customer communities.

Collaborative digital models can deliver great things, such as supplying manufacturing lead times, product availability data, demand forecasts from end customers, and more.

This data availability could, for example, allow you to consider precise inventory levels across the supply chain, based on demand data, prices and market calculations.

Thus, a B2B digital chain could benefit from EDI, made possible with DBSync Cloud Workflow: EDI is used to describe the documents, and Cloud Workflow executes the document transfer via FTP servers.

Once the EDI file has been transferred, a third application, usually called an EDI converter, is necessary to interpret the code contained in the EDI file and transform it into a proper document.


Figure 2: Supply Chain Integration


How Cloud Workflow Manages the Document Transfer

There are two different operations involving an EDI document. First, the EDI file can be transferred between different points via, for example, FTP servers. Second, the EDI document needs to be parsed and transformed into the actual document.

Cloud Workflow can easily transfer text documents stored in an FTP server to another data source, including another FTP server. The files are handled via a separator, in a similar manner to CSV files.

This feature allows Cloud Workflow to transfer EDI files. Although EDI files are defined by many different standards, such as ANSI, EDIFACT and TRADACOMS, their common characteristic is that they can be handled via a separator.
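The transfer step itself is plain file movement. A minimal Python sketch using the standard library's ftplib shows the idea; hosts and credentials are placeholders, and this illustrates the underlying concept rather than DBSync's implementation.

```python
# Sketch: move an EDI text file from one FTP server to another.
# Hosts/credentials are placeholders; Cloud Workflow performs this kind of
# transfer for you -- this is only an illustration of the underlying idea.
import io
from ftplib import FTP

buffer = io.BytesIO()

# Download the EDI file from the source FTP server.
with FTP("ftp.source.example.com") as src:
    src.login("user", "password")
    src.retrbinary("RETR orders/po_850.edi", buffer.write)

# Upload it, unchanged, to the destination FTP server.
buffer.seek(0)
with FTP("ftp.dest.example.com") as dst:
    dst.login("user", "password")
    dst.storbinary("STOR incoming/po_850.edi", buffer)
```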

However, it must be noted that Cloud Workflow doesn't parse the EDI document; it only transfers the text file containing the EDI code.



Figure 3: EDI transferred via Cloud Workflow

In order to understand how Cloud Workflow transfers EDI documents between FTP servers, let’s take a look at the next figure, which shows the basics of an ANSI EDI document definition.

An EDI document starts with a “Start of transaction” indicator, and ends with an “End of transaction” definition. In the middle, the contents of the document are defined.

In EDI terms, a single document is called a “transaction set” or “message”. A document can be something like an invoice, a purchase order or another commercial document. A transaction set is composed of data elements, segments and envelopes.

Data elements are the individual items of the document, and are contained in segments, separated by a data element separator. In the example shown in the figure, the separator is a star (*).

Finally, a document is stored in an envelope, which can contain one or more documents. Envelopes are then transferred between sites.
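To see how the separator makes this format as mechanically simple as CSV, here is a tiny sketch that splits an invented, simplified ANSI-style purchase-order snippet into segments and data elements.

```python
# Toy sketch: split a simplified ANSI-style EDI snippet into segments/elements.
# The snippet is invented for illustration; real documents carry envelopes too.
edi = "ST*850*0001~BEG*00*SA*PO123**20190213~PO1*1*10*EA*9.95~SE*4*0001~"

for segment in edi.rstrip("~").split("~"):   # "~" ends each segment
    elements = segment.split("*")            # "*" separates data elements
    print(elements[0], elements[1:])         # segment ID, then its elements
```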

By using this EDI file structure, Cloud Workflow can transfer EDI documents located in FTP servers in a similar way it transfers CSV files.




Figure 4: Structure of an EDI document

Wrapping Up

We have seen that DBSync’s Cloud Workflow can easily transfer EDI documents stored in FTP servers. The next step is to convert the EDI code into a proper file.

This feature opens immense possibilities. One very important option is its use in Supply Chain Integration, where a chain of FTP servers could serve as a link between different points in the supply chain.

The opportunities opened by this collective digital model are many: from production based on customer data, to inventory planning, reduction of lead times, and more.

Would you like to learn more about Cloud Workflow and its powerful features? Visit our website at https://www.mydbsync.com/, where you can try our app for free.