Wednesday, July 18, 2018

Discover real-time integration with DBSync’s Salesforce Streaming Connector



Have you ever felt the power of speed? That is exactly what our new Salesforce Streaming Connector brings you: real time, made real.
The connector is based on the Salesforce Streaming API, which transmits data in a practically continuous stream, without waiting for the whole data set to arrive, so you can act on the data almost immediately after it arrives.
Moreover, the Salesforce Streaming API helps you focus your data transfer on the relevant data only, as it follows a publisher-subscriber model: users subscribe to channels that broadcast specific data changes, defined via SOQL queries. For example, a channel can be designed to report changes to the email addresses on clients’ accounts.
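For a concrete picture, here is a minimal Python sketch of how such a channel could be defined as a Salesforce PushTopic, using the open-source simple_salesforce library. The credentials, topic name, and query are illustrative placeholders, not part of the DBSync connector itself.

from simple_salesforce import Salesforce

# Connect with placeholder credentials (illustrative only).
sf = Salesforce(username="user@example.com",
                password="password",
                security_token="token")

# The SOQL query defines exactly which record changes the channel broadcasts:
# here, updates to the email address on contact records.
sf.PushTopic.create({
    "Name": "ContactEmailChanges",    # channel address: /topic/ContactEmailChanges
    "Query": "SELECT Id, Email FROM Contact",
    "ApiVersion": 43.0,
    "NotifyForOperationUpdate": True,
    "NotifyForFields": "Referenced",  # fire only when a queried field changes
})

Once the PushTopic exists, every matching change is broadcast on its channel to all subscribers, without the subscribers ever polling the Contact table directly.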

What are the Benefits?

An architecture focused on real-time processing can have a huge impact on how your business functions, because you gain knowledge as transactions take place.
With continuous data processing, a small batch of transactions is processed at a time, making the data available as it arrives.
The benefits are plentiful: data access is faster, fewer resources are needed, and uptime is improved. These advantages can have an enormous effect on your business, helping you outperform your competitors and increase your profits.

When to Use Real-Time Integration

Continuous data processing can have a serious impact on several areas of your business. Three important ones are improved customer service, vendor independence, and error reduction.

Improved Customer Service

Having continuous information feeds can change the way customers respond to incentives. Their responses can be analyzed almost instantly, and incentives adjusted accordingly. Inventories can be updated promptly and production effort reduced. Customers can have receipts sent to them immediately after purchasing, and enjoy near-instant gratification.

Vendor Independence

The capacity to continuously transmit data between different applications frees you from dependence on a single vendor. In this manner, you can choose the systems that you believe are best for each need.

Error Reduction

Continuous data updates ensure that correct data is always available. Changes to contact information, banking details, and other data are immediately reflected across your integrated systems, reducing inconsistencies and ensuring error-free, timely responses.

So, why is this Technology so Efficient?

The answer is simple: the foundations behind it. The Salesforce Streaming API uses several proven technologies that, combined, optimize data transfer.
Asynchronous communication is based on the widely adopted Bayeux protocol and Ajax techniques, maintaining a persistent client-server connection in which events are handled incrementally and interpreted on the client side as soon as they arrive.
Long polling, also known as Comet programming, minimizes the number of messages sent by holding each request open until data is available to be delivered, eliminating no-data messages.
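To illustrate the mechanics, here is a rough Python sketch of the Bayeux handshake, subscribe, and long-poll cycle against Salesforce’s CometD endpoint. The instance URL, access token, and topic name are placeholders, and a real client would add error handling and re-handshaking.

import requests

COMETD = "https://yourInstance.salesforce.com/cometd/43.0"  # placeholder instance
HEADERS = {"Authorization": "Bearer <access_token>",        # placeholder token
           "Content-Type": "application/json"}

def bayeux(message):
    # Every Bayeux exchange is a POST carrying a JSON array of messages.
    return requests.post(COMETD, json=[message], headers=HEADERS).json()

# 1. Handshake: establish the persistent client-server relationship.
handshake = bayeux({"channel": "/meta/handshake", "version": "1.0",
                    "supportedConnectionTypes": ["long-polling"]})[0]
client_id = handshake["clientId"]

# 2. Subscribe to the channel defined by a PushTopic.
bayeux({"channel": "/meta/subscribe", "clientId": client_id,
        "subscription": "/topic/ContactEmailChanges"})

# 3. Long poll: each connect request is held open server-side until an
#    event exists, so no-data round trips are eliminated.
while True:
    for msg in bayeux({"channel": "/meta/connect", "clientId": client_id,
                       "connectionType": "long-polling"}):
        if msg["channel"].startswith("/topic/"):
            print("event received:", msg["data"])  # handle each event on arrival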
On top of these technologies, DBSync’s user-friendly interface ensures that the integration process is fast and easy, letting you design your own integration processes.

Conclusion

We have seen that real-time integration can have a huge impact on your business. Customer needs are serviced swiftly, business performance information is current, and production and delivery are focused on your business’s most important needs.
This continuous, integrated data flow can be easily achieved with DBSync’s Salesforce Streaming Connector, enabling you to create the real-time architecture that optimizes your business.
Find this exciting? Go to our website (https://www.mydbsync.com) and try it for free. If you have any questions, contact us, and one of our representatives will get in touch with you.

Monday, July 16, 2018

Data archival solutions with cloud replication



Are you facing data storage regulatory compliance problems? Fortunately, data archiving is one of those areas where regulatory compliance can be made easy and effective if you have the right archival solutions.
But first, let’s understand what data archival is all about, and why it is different from the regular backups you already do.

What is Enterprise Data Archival

Data archival is the storage of all the data that you need to keep because it may occasionally be useful, but that is no longer used in everyday operations.
Good data archival practices have many benefits, such as reducing the cost of primary storage (and, as a direct consequence, the cost of backup storage), preserving original records as evidence in legal disputes, and, very importantly, helping you stop worrying about regulatory compliance.
Let us now examine some of the most important legal requirements.

Data Archival Regulations

Data archival regulatory compliance relates to data that is still important and that may be needed in the future, particularly in case of litigation. The following are the most important regulations that may affect your business’s data archival:

SOX (Sarbanes-Oxley Act)

This act was passed after the Enron scandal, and it focuses on financial reporting. It applies to any company listed on the NASDAQ or the NYSE. Its main concern is the implementation of controls that ensure the completeness, correctness, and quick accessibility of information. Within this law, some businesses are specifically targeted. For example, accounting firms that audit publicly traded companies are required to retain audit records for no less than seven years after the completion of an audit.

PCI DSS (Payment Card Industry Data Security Standard)

This is a standard that businesses using online payments are required to comply with, particularly if they use credit and debit cards, such as Visa or Mastercard.
The standard stresses information access permissions and is based on the principle of least privilege. This principle says that access to information should be granted only to those who strictly need it, reducing the risk of the data being compromised. In addition, the standard requires the use of encryption.

FINRA (Financial Industry Regulatory Authority)

Basically, FINRA audits banks and financial institutions to ensure their good behavior. Among its most important requirements, FINRA Rule 3110 requires that you preserve your accounts, records, memoranda, books, and correspondence in conformity with all applicable regulations, statements, and rules under SEC Rule 17a-3. Several other parts of the rule also relate to ensuring SEC compliance.

GDPR

Although you may think that, if you are not in Europe, the General Data Protection Regulation (GDPR) does not apply to your business, be aware that it binds any organization processing the personal data of EU citizens, such as names, IP addresses, locations, religion, and ethnicity, among others. Thus, if your company deals with clients in the EU, be sure that you abide by the GDPR requirements, in force since May 2018.
This regulation basically aims at protecting people’s information, and it presents new challenges for businesses. For example, Article 20 of the GDPR establishes the right to data portability. It basically says: if you’ve got some data about me, then I’m allowed to access it when I need it and to require you to provide it to anyone I choose. This applies to both current and historical data.
While regulatory compliance demands real effort on your side, it can also be an opportunity to organize your data. The good news is that the pain can be reduced by having a sound reference architecture and good data archival solutions.

Enterprise Data Archival: Introducing DBSync Cloud Replication and CDM

DBSync Cloud Replication and CDM is an application that helps you concentrate on your business, not on compliance. DBSync assists you by automating many of the tasks, increasing archival efficiency, and reducing errors, while keeping your business secure and cost-effective. Some of its most important features include:
·         The ability to auto-create a schema and new fields by synchronizing an application to a database.
·         Real-time integration via outbound messages.
·         The capacity to obtain data from an application, such as Salesforce, and from a database; merge the data; use it according to your needs; and load the results back to the application (see the sketch below).
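As a rough illustration of that last feature, here is a hedged Python sketch of the extract-merge-load pattern, using the simple_salesforce library and sqlite3 as stand-ins for the actual application and database; the table, fields, and merge logic are purely illustrative.

import sqlite3
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="password",
                security_token="token")       # placeholder credentials
db = sqlite3.connect("archive.db")            # illustrative local database

# Extract: obtain data from the application.
accounts = sf.query("SELECT Id, Name FROM Account")["records"]

# Merge: combine each record with data already held in the database.
for acc in accounts:
    row = db.execute("SELECT region FROM account_regions WHERE sf_id = ?",
                     (acc["Id"],)).fetchone()
    if row:
        # Load: write the merged result back to the application.
        sf.Account.update(acc["Id"], {"Description": "Region: " + row[0]})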

Final words

Improving data management has become critical these days. To learn more about getting your organization’s data archival up to speed, go to our website (https://www.mydbsync.com), where you can find more information about our data archival solution.

Friday, July 13, 2018

Why data replication should not be done using ESB-based integration tools



This is one of the common questions we get when prospects come looking for data replication tools. It’s more a question of integration design patterns than of product implementations.

Let’s get started with what an ESB is: an Enterprise Service Bus. This is an integration design pattern in which messages are passed so that one or more message listeners can listen to and consume them (store and forward). These messages, much like emails, have a header (from and to), a payload (the message), and perhaps attachments. Depending on the ESB, there may be limitations on payload and attachment sizes.

The Flow Looks Like This:

App produces message -> ESB receives message (in a queue) -> Based on routing rules, ESB routes message -> Listener consumes message -> Likely maps/translates data -> Saves/forwards to another app/queue -> Confirms message is received -> ESB tags and stores message as processed.

Notice the queueing, routing, and store-and-forward steps: these are places where data flow can “choke” or “build up” if there is a high flow of large data sets.
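To make those choke points concrete, here is a toy Python rendition of the flow, with in-memory queues standing in for the ESB; the routing rule, mapping step, and app names are placeholders.

import queue

inbound = queue.Queue()    # ESB receives messages here (a potential choke point)
crm_queue = queue.Queue()  # per-destination queue
processed = []             # the ESB's store of processed messages (another one)

def produce(payload):
    # App produces a message: a header (from/to) plus a payload, like an email.
    inbound.put({"header": {"from": "app_a", "to": "crm"}, "payload": payload})

def route():
    # Routing rules: inspect the header and forward to the right queue.
    msg = inbound.get()
    if msg["header"]["to"] == "crm":
        crm_queue.put(msg)

def listen():
    # Listener consumes the message, maps/translates it, forwards, confirms.
    msg = crm_queue.get()
    translated = {"name": msg["payload"].upper()}  # stand-in for real mapping
    # ...save/forward `translated` to the target app or next queue here...
    processed.append(msg)  # ESB tags and stores the message as processed

produce("acme corp")
route()
listen()

Every message passes through each stage in turn, so under a heavy flow of large data sets the queues, the routing step, and the processed-message store each become bottlenecks.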

Now look at data replication: you have a source of data, be it a database (common) or a cloud application (like Salesforce). In data replication, you require a complete copy of both the schema and the data changes. The application is expected to identify schema changes and apply them to the target (without the need for remapping), so interpreting schema changes and being able to adjust the target schema becomes important. The ability to process a large number of rows is also necessary.
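As an illustration of that schema-adjustment requirement, here is a minimal Python sketch that compares a Salesforce object’s fields to a target table’s columns and adds whatever is missing; the object, table, and type mapping are assumptions made for the example.

import sqlite3
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="password",
                security_token="token")       # placeholder credentials
db = sqlite3.connect("replica.db")
db.execute("CREATE TABLE IF NOT EXISTS contact (id TEXT PRIMARY KEY)")

# Compare the source object's fields to the target table's columns.
source_fields = {f["name"] for f in sf.Contact.describe()["fields"]}
target_cols = {row[1] for row in db.execute("PRAGMA table_info(contact)")}

# Add any field the target does not know about yet. A real tool would map
# Salesforce field types to proper SQL types; TEXT is a stand-in here.
for field in source_fields - target_cols:
    db.execute(f'ALTER TABLE contact ADD COLUMN "{field}" TEXT')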
One of the common ways most databases replicate is by using their transaction logs (as you will see when you look under the hood of master–slave replication). When you have disparate applications like Salesforce and Oracle, you have to rely on a query -> extract -> interpret change -> check for target duplicates -> load chain.
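For disparate applications, that chain might look roughly like the following sketch, which polls Salesforce for rows changed since the last run and upserts them into a local table. The timestamp handling, table, and fields are illustrative, and a production tool would add batching, paging, and error handling.

import sqlite3
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="password",
                security_token="token")       # placeholder credentials
db = sqlite3.connect("replica.db")
db.execute("CREATE TABLE IF NOT EXISTS contact (id TEXT PRIMARY KEY, email TEXT)")
last_run = "2018-07-01T00:00:00Z"             # persisted between runs in practice

# Query and extract only the rows changed since the last run.
changed = sf.query("SELECT Id, Email FROM Contact "
                   "WHERE SystemModstamp > " + last_run)["records"]

for rec in changed:
    # Check for a duplicate on the target, then update or insert accordingly.
    if db.execute("SELECT 1 FROM contact WHERE id = ?", (rec["Id"],)).fetchone():
        db.execute("UPDATE contact SET email = ? WHERE id = ?",
                   (rec["Email"], rec["Id"]))
    else:
        db.execute("INSERT INTO contact (id, email) VALUES (?, ?)",
                   (rec["Id"], rec["Email"]))
db.commit()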

Ok, so let’s now look at why ESB-based apps might not be the right choice:

·         An ESB requires store and forward, which might not be necessary for data replication. While you can argue that it will work (yes, you can make it work), it will be slow and overly complicated.
·         ESBs are generally considered to carry higher operational management overhead and to require higher uptime, as they are mostly used for distributed app integration. Replication usually runs in batches (at scheduled times) or, in the case of master–slave setups, much more in real time than ESBs are designed for.
·         Managing schema changes often requires ESBs to remap some of the message flows. Some of our clients really dislike this: not only do they have to track source and target schemas, they often also have to trigger a “Change Management” request up the IT chain, which can take weeks or months to get through. Data replication tools usually adjust target schemas automatically.

When you look at the integration tools market, the industry has segmented itself, with one group going the ESB or message queue route (which is slowly evolving into API-based integration) and the other the data replication route.

So let us see some of the common integration apps and how they fit in:

MuleSoft – a leader in ESB-based integration; it does quite well in service-oriented architecture and in integrating apps like SAP. They are also introducing API management.

Kafka – an open-source messaging platform, very popular for high-volume messaging, especially with IoT and big data. It requires smaller message sizes.

GoldenGate (by Oracle) – a leader in data replication between different databases. It does not yet offer cloud application data replication.

DBSync – Cloud Data Replication uses a direct replication technique, while iPaaS Cloud Workflow is more of a store-and-forward model.

There are many more; perhaps a good place to look is Gartner’s Data Integration and Integration-as-a-Service Magic Quadrants, to see which tools are leading the pack.