In this step-by-step tutorial I’ll show you how to use Talend Open Studio and the Twitter Components Pack to connect to Twitter, run a simple REST query and build a trivial relevance report on top of it. There are tons of similar Talend tutorials out there, but none of them focuses on my Twitter Components Pack, which lets you run queries and parse results without writing a single line of custom code. So let’s dive into this 101 crash course on how to download tweets and build a real-world analysis on them.
Talend Open Studio is a handy ETL tool with amazing extension capabilities and a complete set of tools for building new custom components, as I showed in several posts in the past. Talend offers an automatic way to install components through their official marketplace. However, that place is not famous for its UX or for being attractive to developers and, as a matter of fact, the components hosted there are often poor and outdated. Here I’m going to show a general way to install custom components which works for both Talend Exchange components and third-party hosted ones. For hard-core developers, I’ll also show how to compile a component starting from source code.
In the majority of Talend tutorials related to database operations, I found little or no use of RDBMS prepared statements. To build or parametrize queries, most Talend users and developers seem to prefer a pure string-concatenation approach. But this is absolutely a bad habit, since it exposes the job to serious security flaws (notably SQL injection) and doesn’t take advantage of the statement-caching mechanisms of modern RDBMSs. Although the guys at Talend really don’t make your life easier because of some choices in the I/O DB components, it’s still possible to design a job which makes full use of PreparedStatements. In this tutorial I’m going to introduce a technique for some common use cases, hardening security and improving debugging speed at the same time.
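To see concretely why concatenation is dangerous while a prepared statement is not, here is a minimal self-contained sketch. Talend jobs generate Java under the hood, but the principle is identical in any JDBC-style API; I’m using Python’s built-in sqlite3 module purely so the snippet runs without a database server, and the `users` table and the inputs are invented for illustration.

```python
import sqlite3

# Toy in-memory database with one user (illustrative data).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.execute("INSERT INTO users VALUES (1, ?)", ("O'Brien",))

# Untrusted input crafted to subvert a concatenated query.
malicious = "nobody' OR '1'='1"

# Bad: the input becomes part of the SQL text, changing the query's logic.
concatenated = "SELECT id FROM users WHERE name = '" + malicious + "'"
rows_bad = conn.execute(concatenated).fetchall()
print(rows_bad)   # the injected OR '1'='1' clause matches every row

# Good: the SQL text is fixed and the value is bound to a placeholder,
# so the input can only ever be treated as a literal name.
rows_good = conn.execute(
    "SELECT id FROM users WHERE name = ?", (malicious,)
).fetchall()
print(rows_good)  # no user has that literal name, so nothing matches
```

The same contrast applies in a Talend job: a query built by gluing context variables into a string behaves like `concatenated` above, while a parameterized statement behaves like `rows_good`, with the added benefit that the RDBMS can reuse the compiled statement plan across executions.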
With the new year, Packt Publishing has reinforced its offering of reference guides on Open Source Analytics and Business Intelligence tools with the brand new Talend Open Studio Cookbook. Thanks to the publisher, who gave me a very early copy of the book to review, I had the time to read it twice and get a good understanding of the bundled code. I’m now able to write a complete review, focusing especially on target readers’ needs and on the differences with the other Talend book in Packt’s catalogue. That Talend for DI primer gave us a first idea of what a professionally made reference guide on Talend would look like, but this one is a completely different matter and approaches the subject from a different perspective: the coder’s side of the Moon.
This review is organized in three sections. In the first part, I’m going to go deep into the book’s content and presentation. In the middle part, I will focus on potential readers’ expectations and gains. Finally, in the last section I’m going to summarize my conclusions.
There are plenty of scenarios where one would benefit from a crossover between Talend Open Studio and R. The former is perfect for even complex ETL tasks, which by their very nature involve massive data I/O, manipulation, federation and governance, but it completely lacks any kind of serious statistical tooling.
On the other hand, R is an absolute standard for statisticians, with a huge amount of external packages for practically any kind of analysis one could imagine, but even simple data operations must be hand-coded. The R language is very expressive and extensible, but one would perhaps prefer to spend time reasoning about the predictive model rather than writing code to get the data out of the database. This is particularly true in data exploitation scenarios, but also in rapid prototyping and, generally speaking, in the whole business world.
As if that weren’t enough, R is basically a data language plus a command-line executor. This is historically common for statistical software (just think of SAS), so it isn’t a flaw in itself. But in a real-life Business Intelligence life-cycle, you probably have a corporate standard, a service bus, a protocol for data transfer and so on. A better interface with R is highly advisable.
This is possible using a custom optional Talend component I wrote. In this tutorial I’ll show you how to use R to build a simple predictive model with data coming from Talend, and how to get the results back into Talend itself, ready for all your good ETL habits.
Talend Open Studio is a very nice open and scalable platform for data integration. This Eclipse-based application is made of components, each of which performs a particular data-processing task. Although not very well documented by the vendor (luckily, someone else filled the gap), a dedicated perspective lets the user build new components. The development flow is really primitive: even the majority of the official out-of-the-box TOS components are written in a procedural coding style. This is not a failure in the strict sense of the word, as it is the most common style in ETL environments for historical reasons, from when speed was of the essence. But in the new world of Big Data, NoSQL and graph databases, and with modern hardware, this approach is totally inadequate. In the past, I presented a way to build components in a Maven-aided environment. Starting from that, I built an OOP framework that aims to push and help the developer to write components in an object-oriented flavour.
In this article I would like to show you the basic idea behind my framework and the advantages you could obtain by using it, in terms of coding speed and code reliability. Then we’re going to explore the framework itself and x-ray a Talend component. You’ll be surprised at how easy it is to build a component using design patterns!