Articles related to trading software development and trading platforms aiding automated investment operations.

How Artificial Intelligence will revolutionize wealth management

by Michal Rozanski, CEO at Empirica

Most wealth managers are in deep denial about robo advice. They say they need human interaction in order to understand the nuances of financial lives of their customers. And their clients value the human touch. They’re wrong. Soon robo advice will be much more efficient than human advice ever was.

In this post, we will share the results of our analysis on the most important areas where the application of machine learning will have the greatest impact in taking wealth management to the next level.

What Artificial Intelligence is and why you should care

“Computers can only do what they are programmed to do.” Let us explain why this is a huge misconception, one that was only valid because of the limited processing power and memory capacity of earlier computers. The most advanced programs that mimic specialized intelligence, known as expert systems, were indeed programmed around a set of rules based on the knowledge of specialists within the problem’s domain. There was no real intelligence there, only programmed rules. But there is another way to program computers, which makes them work more like the human brain. It is based on showing the program examples of how certain problems can be solved and what results are expected. This way, computers equipped with enough processing power, memory and storage are able to recognize objects in photographs, drive autonomous cars, recognize speech, or analyse any form of information which exhibits patterns.

 

We are entering the age where humans are outperformed by machines in activities that involve reasoning over large amounts of information. Because of this, finance and wealth management will be profoundly changed in the years to come.

 

Real advice – combining plans with execution

A great area for improvement in finance management is combining long-term wealth building with the customer’s current financial situation as reflected in his bank account. For robo-advisors, integration with banking APIs opens the door to an ocean of data which, after analysis, can dramatically improve the accuracy of the advice provided to the customer.

By applying machine learning capabilities to a customer’s monthly income and expenses data, wealth managers will gain a unique opportunity to combine two perspectives – the long-term financial goals of their customers and their current spending patterns. Additionally, there is the potential to optimize taxes, mortgages, loans or credit card costs, as well as to use spending history to predict future expenditures.

By integrating data from social media, wealth management systems could detect major changes in one’s life situation – job, location, marital status or remuneration. This would allow automated real-time adjustments of investment strategies at the finest level of detail, which human advisors are simply unable to deliver.

New powerful tools in the wealth manager’s arsenal

Hedge funds basing their strategies on AI have delivered better results over the last five years than the average fund (source: Eurekahedge). What is interesting is that the gap between AI and other strategies has grown wider over the last two years, as advancements in machine learning have accelerated.

The main applications of machine learning techniques in wealth management can be categorized into the following cases:

  •       Making predictions on real-time information from sources such as market data, financial reports, news in different languages, and social media
  •       Analysis of historical financial data of companies to predict the company’s cash flow and important financial indicators based on the past performance of similar companies
  •       Analysis of management’s public statements and activity on social networks in order to track the integrity of their past words, actions and results
  •       Help in accurate portfolio diversification by looking for uncorrelated instruments which match requirements of the risk profile
  •       Generation of investment strategies parametrized by goals such as expected risk profiles, asset categories, and timespan, resulting in sets of predictive models which may be applied in order to fulfill the assumptions

To give an example of machine learning accuracy: algorithms for sentiment analysis and document classification already perform at acceptable levels, well above 90%.
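To make this concrete, below is a minimal sketch of the kind of text-classification pipeline such systems build on, using scikit-learn. The tiny inline dataset and the label set are illustrative assumptions only; production systems are trained on large labeled corpora of financial news.

```python
# Hedged sketch: a toy news-sentiment classifier (TF-IDF + logistic regression).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_texts = [
    "Company beats earnings expectations, raises guidance",
    "Regulator opens investigation into accounting practices",
    "New product launch drives record quarterly revenue",
    "Shares plunge after CEO resigns amid fraud allegations",
]
train_labels = ["positive", "negative", "positive", "negative"]

# TF-IDF features feeding a linear classifier: a common, simple baseline
# before moving to deeper models for higher accuracy.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(train_texts, train_labels)

print(model.predict(["Shares plunge after profit warning"]))  # likely ['negative']
```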

Automated execution

When it comes to executing the actual orders behind portfolio allocation and rebalancing strategies, many robo-advisors automate these processes by passing generated orders to brokerage systems. The next step would be autonomous execution algorithms that take into consideration the changing market situation and learn from incoming data, allowing for increased investment efficiency and reduced costs.

Machine learning can be applied to quantitative strategies like trend following, pattern recognition, mean reversion, and momentum, as well as to the prediction and optimization of statistical arbitrage and pairs trading. Additionally, machine learning techniques can be applied to the already quite sophisticated execution algorithms that execute large orders by dividing them into thousands of smaller transactions without influencing the market, adjusting their aggressiveness to the market situation.
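To illustrate the order-slicing idea behind such execution algorithms, here is a minimal TWAP-style sketch. It only shows the basic schedule; the quantities, the duration and the print-out standing in for a broker API call are all hypothetical.

```python
# Hedged sketch: split a parent order into equal child orders spread evenly
# over a time window (a basic TWAP schedule). Real execution algorithms also
# adapt slice size and aggressiveness to live market conditions.

def twap_slices(total_qty: float, duration_s: float, n_slices: int):
    child_qty = total_qty / n_slices
    interval = duration_s / n_slices
    for i in range(n_slices):
        yield i * interval, child_qty  # (seconds from start, child quantity)

# Hypothetical usage: buy 100 BTC over one hour in 60 child orders.
for delay, qty in twap_slices(total_qty=100.0, duration_s=3600, n_slices=60):
    print(f"t+{delay:.0f}s: send child order for {qty:.4f} BTC")
```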

What’s interesting is that algorithms could also be trained to make use of rare events like market crashes, and to react properly in milliseconds, already knowing the patterns of panic behaviour and liquidity shortages.

Explaining the markets

In wealth management systems, if portfolio valuations are provided to the customers in real time, then so should explanations of the market situation. Every time the customer logs in to the robo-advisor, she should see all required portfolio information with a summary of market information relevant to the content of her portfolio. This process includes the selection of proper articles or reports concerning companies from the investor portfolio, classification and summarization of negative or positive news, and delivering a brief overview.

Additionally, machine learning algorithms can be used to discover which articles are read by customers and to present only the types of articles that the customer previously opened and read.

The result will be not only an increase in customer understanding but also, by providing engaging content to investors, an increase in their engagement and commitment to the portfolio strategy and wealth management services.

 

Talking with robots

The ability to deliver precise explanations of the market situation in combination with conversational interfaces aided by voice recognition technology will enable robo-advisors to provide financial advice in a natural, conversational way.

Voice recognition is still under development, but it could be the final obstacle on the way to redesigning human-computer interaction. On the other hand, thanks to deep learning, chatbot technology and question answering systems are getting more reliable than ever. KAI, the chatbot platform of Kasisto, which has been trained on millions of investment and trade interactions, already handles 95% of all customer queries for India’s digibank.

Decreasing customer churn with behavioral analysis

The ability to track all customer actions, analyze them, find common patterns in huge amounts of data, make predictions, and offer unique insights delivers a powerful business tool that was not previously available to wealth managers. What if nervousness caused by portfolio results or the market situation could be observed in user behaviour within the system? This information, combined with investment results and the behaviour patterns of other investors, can give a wealth manager the ability to predict customer churn and react in advance.

Wealth management executives using our robo-advisory solutions point to behavioural analysis as one of the most important advancements to their current processes. Customers leave not only when investment results are bad, but also when they are good, if there is a fear that the results may not be repeated in the future. Therefore, the timely delivery of advice and of explanations of market changes and the current portfolio situation is crucial.

The same model we used to solve the behavioral analysis problem has been proven to predict credit fraud in 93.07% of cases.

Summary

Other areas where machine learning could be applied in the processes supporting wealth management services include:

  •       Security based on fraud detection which actively learns to recognize new threats
  •       Improving sales processes with recommendations of financial products chosen by similar customers
  •       Psychological profiling of customers to better understand their reactions in different investment situations      
  •       Analysis and navigation of tax nuances   
  •       Real estate valuation and advice

Implementing these AI functions in wealth management systems will be an important step towards differentiating wealth managers in the market. Today’s wealth manager’s tool set will look completely different in five years. Choosing an open and innovative robo-advisory system that tackles these future challenges is crucial. Equally important will be wealth managers’ incorporation of data analytics processes and the use of this data to help their customers.

Artificial intelligence is poised to transform the wealth management industry. This intelligence will be built on modern software platforms that combine data from different sources, process it, and transform it into relevant financial advice. The shift from data-gathering systems to predictive ones that help wealth managers understand the data has already started. And wealth management is all about understanding the markets and the customers.

 

 

Now Crypto. Lessons learned from over 10 years of developing trading software

By Michal Rozanski, CEO at Empirica.

Reading the news about crypto, we regularly see big money flowing into new companies with a lot of potentially breakthrough ideas. But aside from the hype on the business side, there are sophisticated technical projects going on underneath.

And for new cryptocurrency and blockchain ideas to be successful, these projects have to end with the delivery of great software systems that scale and last. Because we have been building these kinds of systems for the financial markets for over 10 years, we want to share a bit of our experience.

Read more on how Empirica delivers its trading software development services

“Software is eating the world”. I believe these words by Marc Andreessen. And now the time has come for financial markets, as technology is transforming every corner of the financial sector. Algorithmic trading, which is our speciality, is a great example. Other examples include lending, payments, personal finance, crowdfunding, consumer banking and retail investments. Every part of the finance industry is experiencing rapid changes triggered by companies that propose new services with heavy use of software.

If crypto relies on software, and there is so much money flowing into crypto projects, what should you look for when undertaking a trading software project for cryptocurrency markets? Our trading software development projects for the capital and crypto markets, as well as building our own algorithmic trading platform, have taught us a lot. Now we want to share the lessons learned from these projects.

 

  1. The process – be agile.

Agile methodology is the essence of how software projects should be run. Short iterations. Frequent deliveries. Fast and constant feedback from users. Having a working product from the early iterations gives you the best understanding of where you are now and where you should go.

It doesn’t matter if you outsource the team or build everything in-house; if your team is local or remote. Agile methodologies like Scrum or Kanban will help you build better software, lower the overall risk of the project and will help you show the business value sooner.

 

  2. The team – hire the best.

A few words about productivity in the software industry. This citation is from my favourite article by Robert Smallshire, ‘Predictive Models of Development Teams and the Systems They Build’: ‘… we know that on a small 10 000 line code base, the least productive developer will produce about 2000 lines of debugged and working code in a year, the most productive developer will produce about 29 000 lines of code in a year, and the typical (or average) developer will produce about 3200 lines of code in a year. Notice that the distribution is highly skewed toward the low productivity end, and the multiple between the typical and most productive developers corresponds to the fabled 10x programmer.’

I don’t care what people say about lines of code as a metric of productivity. That’s only used here for illustration.

The skills of the people may not be that important when you are building relatively simple portals with some basic backend functionality. Or mobile apps. But if your business relies on sophisticated software for financial transactions processing, then the technical skills of those who build it make all the difference.

And this is the answer to the unasked question of why we at Empirica hire only the best developers.

We tech founders tend to forget how important it is to have not only the best developers but also the best specialists in the area in which we want to market our product. If you are building an algo trading platform, you need quants. If you are building a banking omnichannel system, you need bankers. Besides, especially in the B2B world, you need someone who will speak to your customers in their language. Otherwise, your sales will suck.

And finally, unless you hire a subcontractor experienced in your industry, your developers will not understand the nuances of your area of finance.

 

  3. The product – outsource or build in-house?

If you are seriously considering building a new team in-house, please read the points about performance and quality, and ask yourself this question: ‘Can I hire people who are able to build systems at the required performance and stability levels?’. And these auxiliary questions: can you hire developers who really understand multithreading? Are you able to really verify their abilities, hire them, and keep them with you? If yes, then you have a chance. If not, better to outsource.

And when deciding on outsourcing – do not outsource to just any IT company hoping they will take care of everything. Find a company that builds systems similar to what you intend to build. Similar not only from a technical side but also from a business side.

Can outsourcing be done remotely without unnecessary threat to the project? It depends on a few variables, but yes. Firstly, the skills mentioned above are crucial, not the place where people sleep. Secondly, there are many tools to help you make remote work as smooth as local work: Slack, Trello, GitHub, daily standups on Skype. Use them. Thirdly, find a team with proven experience in remote agile projects. And finally, the product owner will be the most important position for you to cover internally.

And one remark about a hidden cost of in-house development, inseparably related to the IT industry – staff turnover costs. Depending on the source of the research, turnover rates for software developers are estimated at 25% to even 38%. That means that when building your in-house team, every fourth or even every third developer will not be with you a year from now. Finding a good developer takes months. Getting a new developer up to speed takes another few months. When deciding on outsourcing, you are also outsourcing the cost and stress of staff turnover.

 

  4. System’s performance.

For many crypto projects, especially those related to trading, the system’s performance is crucial. Not for all, but when it is important, it is really important. If you are building a lending portal, performance isn’t as crucial. Your customers are happy if they get a loan in a few days or weeks, so it doesn’t matter if their application is processed in 2 seconds or in 2 minutes. If you are building an algo trading operation or a bitcoin payments processing service, you measure time in milliseconds at best, maybe even in nanoseconds. And then system performance becomes a key input to the product map.

95% of developers don’t know how to program with performance in mind, because 95% of software projects don’t require these skills. Skills like knowing where bytes of memory go, when they will be cleaned up, and which structure is more efficient for a given operation on a given type of object. Or the nightmare of IT students – multithreading. I can count on my hands the people I know who truly understand this topic.

 

  5. Stability, quality and level of service.

Trading, understood as an exchange of value, is all about trust. And software in crypto usually processes financial transactions in some way.

Technology may change. Access channels may change. You may not have the word ‘bank’ in your company name, but you must have a bank’s level of service. No one in the world will let someone play with their money. Allowing the risk of technical failure may put you out of business. You don’t want to skimp on technology. In the crypto space there is no room for error.

You don’t achieve quality by putting 3 testers behind each developer. You achieve quality with processes of product development. And that’s what the next point is about.

 

  6. The DevOps

The core idea behind DevOps is that the team is responsible for all the processes behind the development and continuous integration of the product. And it’s clear that agile processes and good development practices need frequent integrations. Non-functional requirements (stability and performance) need a lot of testing. All of this is an extra burden, requiring frequent builds and a lot of deployments on development and test machines. On top of that there are many functional requirements that need to be fulfilled and once built, kept tested and running.

On many larger projects the team is split into developers, testers, release managers and system administrators working in separate rooms. From a process perspective this is unnecessary overhead. The good news is that this is more the bank’s way of doing business, rarely the fintech way. This separation of roles creates an artificial border between when functionalities are complete from the developers’ point of view and when they are really done – tested, integrated, released, stable, ready for production. By putting all responsibilities in the hands of the project team you can achieve similar reliability and availability, with a faster time to market. The team also communicates better and can focus its energy on the core business, rather than administration and firefighting.

There is a lot of savings in time and cost in automation. And there are a lot of things that can be automated. Our DevOps processes have matured with our product, and now they are our most precious assets.

 

  7. The technology.

The range of technologies applied in crypto software projects can be as wide as in any other industry. Which technology best fits the project depends, well, on the project. Some projects are really simple, such as a mobile or web application without complicated backend logic. Here technology will not be a challenge. Generally speaking, though, crypto projects can be some of the most challenging projects in the world, where the technologies applied can be the difference between success and failure. Need to process 10K transactions per second with a mean latency under 1/10th of a millisecond? You will need proven technology, will probably have to resign from standard application servers, and will have to write a lot of things from scratch to control the latency at every level of the critical path.

Mobile, web, desktop? This is more a business decision than a technical one. Some say the desktop is dead. Not in trading. If you sit the whole day in front of the computer and need to refer to more than one monitor, forget mobile or web. Your iPhone? It can be used as an additional channel to briefly check, when you go to lunch, whether the situation is under control.

 

  8. The Culture.

After all these points, you have a talented team working as a well-oiled mechanism with agile processes, one that knows what to do and how to do it. Now you need to keep the spirits high through the next months or years of the project.

And it takes more than a cool office, table tennis, Xbox consoles or Friday parties to build the right culture. Culture is about shared values. Culture is about a common story. With our fintech products or services we are often going up against big institutions. We are often trying to disrupt the way their business used to work. We are small and want to change the world, going to war with the big and the powerful. Doesn’t that look like another variation of the David and Goliath story? Don’t smile, this is one of the most effective stories. It unifies people and makes them go in the same direction with a strong feeling of purpose, a mission. This is something many startups in other, non-fintech branches can’t offer. If you are building the 10th online grocery store in your city, what can you tell your people about the mission?

Read more on how Empirica delivers its crypto software development services

 

Final words

Crypto software projects are usually technologically challenging. But that is just a risk that needs to be properly addressed with the right people and processes or with the right outsourcing partner. You shouldn’t outsource the responsibility of taking care of your customers or finding the right market fit for your product. But technology is something you can usually outsource and even expect significant added value after finding the right technology partner.

At Empirica we have taken part in many challenging crypto projects, so learn our lessons, learn from others, learn your own and share them. This cycle of learning, doing and sharing will help the crypto community build great systems that change the rules of the game in the financial world!

 

 

Algorithmic crypto trading: market specifics and strategy development

By Marek Koza, Product Owner of Empirica’s Algo Platform

Among trading professionals, interest in cryptocurrency trading is steadily growing. At Empirica we see it in the increasing number of requests from trading companies, usually associated with traditional markets, seeking algorithmic solutions for cryptocurrency trading. However, the new crypto markets suffer from old and well-known problems. In this article, I point out the main differences between traditional and crypto markets and take a closer look at a few algorithmic strategies that are currently effective in the crypto space. The differences between crypto and traditional markets are an interesting and deep subject in their own right, and one that evolves quickly given the fast pace of change in crypto. But here I only want to focus on the algorithmic trading perspective.

 

Read more about our tool for market making strategies for crypto exchanges  – Liquidity Engine

 

LEGISLATION

First, there is a lack of regulation in terms of algorithmic usage. Creating DMA algorithms on traditional markets requires a great deal of additional work to meet the reporting and measurement standards, as well as the limitation rules, imposed by regulators (e.g., the EU’s MiFID II or the US’s RegAT). In most countries, crypto exchanges have yet to be covered by such legal restrictions. Nevertheless, exchanges impose their own internal rules and technical limitations which significantly restrict algorithmic use, especially in the HFT field. This is crucial for market-making activities, which now require separate deals with trading venues.

 

DERIVATIVES

As for market-making, we should note an almost non-existent derivatives market in the crypto world. Even if a few exchanges offer futures and options, they apply only to a few of the most popular cryptocurrencies. Combine this with highly limited margin trading and the absence of index derivatives (contracts which reflect broad market pricing), and we see that many hedging strategies are almost impossible to execute and may only exist as a form of spot arbitrage.

 

DECENTRALIZATION

The above-mentioned facts are slightly compensated for by the biggest advantage of blockchain currencies – fast and direct transfers around the world without bank intermediation. With cryptoexchange APIs mostly allowing the automation of withdrawal requests, this opens up new possibilities for algorithmic asset allocation by firms much smaller than the biggest investment banks. This is important for two reasons. Firstly, there is still no one-stop market brokerage solution like the ones we know from traditional markets. Secondly, cryptocurrency trading is distributed among many exchanges around the world. It can therefore be tricky for liquidity seekers and heavy volume execution. This implies there is still much to do for execution algorithms such as smart order routing.

 

CONNECTIVITY

A smart order routing strategy GUI

Another difference is direct market access for algorithmic trading. While on traditional markets DMA is costly, cryptocurrency exchanges provide open APIs for all their customers that may be used without upfront prerequisites. Although the adopted protocols are usually easy to implement, they are often too simplistic. They do not usually offer advanced order types. Besides, following the order life-cycle status is cumbersome, and trading protocols differ among exchanges, each one requiring its own implementation logic. That makes for a costly technical difference compared to traditional markets with common standards such as the FIX protocol.

 

MARKET DATA

Fast, precise and up-to-date data are crucial from an algorithmic trading perspective. When a trader develops algorithms for crypto-trading, she should be aware of a few differences. APIs provided by crypto-exchanges give everyone easy access to time & sales or level II market data for free. Unfortunately, data protocols used in the crypto space are unreliable, and trading venue systems often introduce glitches and disconnections. Moreover, not every exchange supports automatic updates, so an algorithm has to issue a request every time it needs to check the state of a market, which is difficult to reconcile with algorithmic strategies.
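As an illustration of that request-based model, here is a minimal polling sketch. The endpoint URL, the response shape and the handle_update callback are hypothetical placeholders; every real exchange has its own API and rate limits to respect.

```python
# Hedged sketch: poll an order book snapshot over REST when no streaming
# updates are available, and react only when the book actually changes.
import time
import requests

BOOK_URL = "https://api.example-exchange.com/v1/orderbook?symbol=BTCUSD"  # hypothetical

def handle_update(snapshot):
    best_bid = snapshot["bids"][0][0]
    best_ask = snapshot["asks"][0][0]
    print("best bid/ask:", best_bid, best_ask)

def poll_order_book(interval_s: float = 1.0):
    last_snapshot = None
    while True:
        resp = requests.get(BOOK_URL, timeout=5)
        resp.raise_for_status()
        snapshot = resp.json()  # e.g. {"bids": [[price, qty], ...], "asks": [...]}
        if snapshot != last_snapshot:   # strategy reacts only to real changes
            last_snapshot = snapshot
            handle_update(snapshot)
        time.sleep(interval_s)          # stay within the venue's rate limits
```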

The APIs of most exchanges allow downloading of historical time & sales data, which is important in the algorithm development process. However, historical level II data are not offered by exchanges. We should also note that, despite being immature, the systems of crypto trading venues are evolving and becoming more and more professional. This forces trading systems to follow and adapt to these changes, which adds significant cost to system maintenance. In the following sections I review a few trading algorithms that are currently popular among crypto algo traders because of the differences between traditional and crypto markets listed above.

 

SMART ORDER ROUTING

Liquidity is, and most probably will remain, one of the biggest challenges for cryptocurrency trading. Trading in bitcoin, ethereum and all the other altcoins with smaller market capitalisation is split among over 200 different exchanges. Executing a larger volume in any type of asset often requires seeking liquidity on more than one trading venue. To achieve that, cryptocurrency traders may apply smart order routing strategies. These follow limit order books for the same instrument on different exchanges and aggregate them internally, as sketched below. When an investment decision is made, the strategy splits the order among the exchanges that offer the best prices for the instrument. A well-designed strategy will also manage partially filled orders left in an order book in case some volume disappears before the order has arrived at the market. This strategy can be combined with other execution strategies such as TWAP or VWAP.
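Below is a simplified sketch of that routing logic: ask levels from several venues are merged into one virtual book, and a buy order is split across the cheapest levels. The exchange names and quotes are made up, and fees, latency and partial-fill handling are deliberately left out.

```python
# Hedged sketch: merge asks from several exchanges and split a buy order
# across the venues offering the best prices.

def route_buy(order_qty, books):
    """books: {exchange: [(ask_price, qty), ...], each sorted best-first}."""
    merged = sorted(
        (price, qty, venue)
        for venue, levels in books.items()
        for price, qty in levels
    )
    child_orders, remaining = [], order_qty
    for price, qty, venue in merged:
        if remaining <= 0:
            break
        take = min(qty, remaining)
        child_orders.append((venue, price, take))
        remaining -= take
    return child_orders  # one child order per (venue, price) level

books = {
    "exchangeA": [(30000.0, 0.5), (30010.0, 1.0)],
    "exchangeB": [(30005.0, 2.0)],
}
print(route_buy(1.0, books))
# -> [('exchangeA', 30000.0, 0.5), ('exchangeB', 30005.0, 0.5)]
```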

Empirica algorithmic trading platform front-end app (TradePad) for crypto-markets.

 

ARBITRAGE

The days when simple cross-exchange arbitrage was profitable with manual execution are over. Nowadays the price differences among exchanges for the most actively traded crypto-assets are much smaller than a year ago, while transaction and transfer costs (especially for fiat) remain high. Trading professionals are now focused on more sophisticated arbitrage algorithms such as maker-taker or triangular arbitrage. The former works by quoting a buy order on one exchange based on the VWAP for a particular amount of volume on another exchange (for the same instrument), decreased by expected fees and return. The strategy actively moves the quoted order, and if the passive order gets executed, it sends a closing order to the other exchange. As this arbitrage looks for bid-bid and ask-ask differences, and maker fees are often lower, this type of arbitrage strategy is more cost-effective.

Triangular arbitrage may be executed on a single exchange, because it looks for discrepancies among three currency pairs that are connected to each other. To illustrate, let us use this strategy with the BTCUSD, ETHUSD and ETHBTC pairs. The strategy keeps following the order books of these three instruments. The goal is to find inefficient quoting and execute trades on the three instruments simultaneously. To understand this process, note that the product of the BTCUSD and ETHBTC rates should reflect the ETHUSD market rate. Contrary to some FX crosses, all cryptocurrency pairs are priced independently. This creates numerous possibilities for using triangular arbitrage in the crypto space.
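A minimal sketch of that detection logic follows. The quotes and the flat fee assumption are illustrative; a real implementation would work from executable order book levels, not single prices.

```python
# Hedged sketch: detect a triangular-arbitrage signal on one exchange.
# Implied ETHUSD = BTCUSD * ETHBTC; if the quoted ETHUSD deviates from it
# by more than the round-trip fees, an opportunity may exist.

def triangular_signal(btc_usd, eth_btc, eth_usd, fee_rate=0.002):
    implied_eth_usd = btc_usd * eth_btc
    edge = (eth_usd - implied_eth_usd) / implied_eth_usd
    total_fees = 3 * fee_rate               # three legs, each paying a fee
    if edge > total_fees:
        return "buy ETH via BTC, sell ETHUSD"    # implied route is cheaper
    if edge < -total_fees:
        return "buy ETHUSD, sell ETH via BTC"    # quoted pair is cheaper
    return None  # difference too small to cover costs

# ETHUSD quoted at 2130 vs an implied 30000 * 0.07 = 2100 -> signal fires.
print(triangular_signal(btc_usd=30000.0, eth_btc=0.0700, eth_usd=2130.0))
```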

 

MARKET MAKING

Market making should be considered more as a type of business than as just a strategy. The main task of a market maker is to provide liquidity to markets by maintaining bid and ask orders so that other market participants can trade any time they need to. Since narrow spreads and adequate prices are among the biggest factors in an exchange’s attractiveness, market making services are in high demand. On the one hand, crypto exchanges have special offers for liquidity providers; on the other hand, they require new coin issuers to engage a market maker before they will list an altcoin.

These agreements are usually one source of a market maker’s income. Another is the spread – the difference between the buy and sell prices offered to other traders. The activity of a market maker involves certain risks. One of them is inventory imbalance – if a market maker buys much more than she sells, or sells much more than she buys, she is left with an open long or short position and takes on portfolio risk, especially on volatile crypto markets. This situation may happen in markets with a strong bias, or when the market maker is quoting wrong or delayed prices, which will immediately be exploited by arbitrageurs. To avoid such situations, market makers apply algorithmic solutions such as different types of fair price calculation, trade-outs, hedging, and trend and order-flow prediction. The technology and math used in market making algorithms are an interesting subject for future articles.
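As a toy illustration of how quoting can defend against inventory imbalance, the sketch below skews both quotes against a growing position so that trades which flatten it become more likely. The fair price, spread and skew parameters are invented for the example; real fair-price models and risk limits are far more involved.

```python
# Hedged sketch: inventory-aware market-making quotes around a fair price.

def make_quotes(fair_price, half_spread, inventory, max_inventory, skew_bps=10.0):
    # Skew grows with how full the inventory is, expressed in basis points.
    skew = (inventory / max_inventory) * skew_bps / 10_000 * fair_price
    bid = fair_price - half_spread - skew   # long inventory pushes both quotes down,
    ask = fair_price + half_spread - skew   # making our sells more likely to execute
    return bid, ask

# Long 6 BTC out of a 10 BTC limit: both quotes shift down by 18 USD.
print(make_quotes(fair_price=30000.0, half_spread=15.0,
                  inventory=6.0, max_inventory=10.0))
# -> (29967.0, 29997.0)
```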

 

Read more about our tool for market making strategies for crypto exchanges  – Liquidity Engine

 

SUMMARY

The fast-developing crypto markets are attracting a growing number of participants, including more and more trading professionals from traditional markets. However, the crypto space has its own specifics, such as high decentralization, maturing technology and a maturing market structure. Compared to other markets, these differences make some strategies more useful and profitable than others. Arbitrage – even simple cross-exchange arbitrage – is still very popular. Market making services are in high demand. Mid-sized and large orders call for execution algorithms like smart order routing. At the end of the day, to embrace the fast-changing crypto environment, one needs algorithmic trading systems with an open architecture that evolves alongside the market.

 


 

Blockchain meetup sponsored by Empirica, Wroclaw

On Monday, June 19th, a beautiful sunny day in IT-friendly Wroclaw, tech start-ups and cryptocurrency enthusiasts gathered at the IT Corner tech meetup, sponsored by Empirica.

The event was planned to focus on key areas of current trends in Blockchain and Ethereum.

The event began with a presentation by Mr Wojciech Rokosz, CEO of Ardeo. The session was dedicated to an introduction to token economics, explaining the changes we are facing, and will face, in our economy with the huge influx of virtual currencies.

The event carried on with Mr Marek Kotewicz’s introduction to blockchain, Bitcoin and Ethereum, summarizing the differences between Bitcoin and Ethereum.

The third and last part of the event was conducted by Mr Tomek Drwga, the Blockchain meetup organizer, diving deeper into smart contracts and programming for Ethereum (an introduction to Solidity).

The event ended with an open discussion between the audience and the speakers, and beverages were served.

The Importance of Agile Software Development

The strategic value of software to companies continues to grow, and businesses are increasingly looking to software as the technology empowering and differentiating their businesses and products. Software is driving many of today’s key technology trends, such as cloud computing, mobility and social networking. Software embedded in products is also transforming numerous industries, such as manufacturing, healthcare and communications. Organizations are challenged to deliver their business applications and software-driven products faster. To successfully compete, innovate and grow, organizations require expertise and solutions that adapt to competitive dynamics and customer needs.

Read more on how Empirica delivers its crypto software development services

 

Many businesses use manual procedures and unsophisticated tools, including paper-based approaches and spreadsheets, to manage workflow throughout the software development lifecycle. These techniques are generally suitable only for smaller development projects handled by a single team and cannot scale to meet the requirements of larger businesses and multi-team projects. The waterfall method gained prominence as the way to manage large software development projects. This strategy, which may take many months or years to complete, relies on rigid sequential execution of the phases of the software development lifecycle, including analysis, design, coding, integration and final testing. Enterprises employing the waterfall method frequently structure internal departments around each development phase and use distinct legacy software tools for every phase and department, leading to siloed and disparate information, limited transparency and collaboration between teams, and an increased risk of misalignment between software development and company initiatives. The next diagram illustrates the waterfall process:

 

Agile was introduced by a group of software visionaries in 2001 via an open letter. It represented a methodology for software creation and delivery designed to decrease costs and significantly improve quality, time-to-market and client satisfaction. Agile projects build software incrementally, in small batches, using iterations of one to four weeks which help keep development aligned with changing business needs. Agile is increasingly replacing waterfall processes across several industries because of the results it can provide. According to the Standish Group, software applications developed with Agile techniques have three times the success rate of applications developed using the waterfall process. The Standish Group defines a successful project as one delivered on time, on budget, and with all the planned features and functions. The following diagram illustrates the Agile methodology:

 

Agile techniques are being adopted by enterprises, replacing and disrupting the legacy software development methods that have comprised the market for application lifecycle software. According to IDC, the application lifecycle market comprises the software configuration management, IT project and portfolio management, and automated software quality markets. In aggregate, IDC estimated these markets would reach $5.2 billion in 2012. Though these markets have been addressed by solutions supporting waterfall and other legacy methods of software development, we believe a transition to solutions supporting software development methodologies like Agile is ongoing. Our cloud-based platform of management solutions is designed to address these markets and facilitate the adoption of Agile practices by businesses.

 

Furthermore, cloud-based solutions have a small but fast-growing share of the total application development and deployment market, which includes Agile management solutions.

 

Organizations that develop business applications and software-related products face a range of business challenges. These are usually directly attributable to legacy software development methods. These challenges include:

 

  • Shortening time-to-market and increasing customer expectations. Competition for customers’ attention continues to intensify and, as a result, the importance of being first to market has significantly increased. Additionally, with the proliferation of always-connected customers and mobile devices, product reviews can be continuous and instantaneous. It becomes paramount that a product’s features, quality and customer experience match or exceed customer expectations at launch, as poor customer experiences can be readily shared. Customers’ demand for speed and quality has placed increased focus and pressure on how organizations handle the software development lifecycle.

 

  • Limited transparency into big development projects. Comprehensively monitoring the progress of multi-team development projects and actively managing their priorities is a challenge. These challenges are exacerbated when projects are big and teams are globally distributed. For projects with shifting priorities and continuously evolving demands, the ability to keep visibility into quality and progress is even more challenging. Organizations that use legacy software management offerings may have separate applications for each role and phase of the development lifecycle. These organizations struggle with siloed, out-of-date status information and have limited insight into the work ahead. The ability to effectively adapt, optimize decisions and allocate resources necessitates a view of the software development process across all teams.

 

  • Difficult coordination of feedback and collaboration. For large-scale development projects, feedback and collaboration among developers, business leaders and clients can be hard to coordinate. Collaboration must be continuous, and the software development strategy has to allow for adaptation as the comprehension of customer needs or business needs evolves.

 

  • Difficult transition to alternative development approaches and solutions. Approaches and tools used for managing the software development lifecycle frequently reflect long-established habits and processes. Because of this, there is resistance to change, even if change can be demonstrated to boost time-to-market, productivity or software quality. Generally, legacy software management tools do not support the short cycles utilised in Agile development. These applications are usually rigid, non-intuitive and costly, requiring lengthy implementations, increasing the challenge for any development organization attempting to adjust its own processes.

 

  • Inflexible offerings. Conventional offerings for, and approaches to, managing the software development lifecycle can be hard to use or may lack the flexibility required to accommodate changing or varying goals and organizational needs. In addition to this downside, legacy offerings might require substantial resources and time to implement and can often be expensive. Further, after these offerings have been installed, updating, extending or adding new functionality often necessitates bespoke development efforts which are time-consuming and costly.

Read more on how Empirica delivers its crypto software development services

Modern monitoring software – things to look for and things to avoid

Monitoring software sits at the base of a company’s IT stack. Without monitoring, organizations are blind to the factors that affect the performance, reliability, scalability and availability of their systems. Once installed, monitoring becomes essential to an organization’s performance and embedded into business and operational workflows. There are a number of industry trends currently changing the way organizations deploy, manage and use software applications and their underlying technology infrastructure. These trends are creating a significant opportunity to displace existing monitoring solutions and reshape the product categories, and include:

 

Read more on Crypto Exchange Monitoring Software

 

Modern technologies create significant challenges for IT. Technologies such as containers, microservices and serverless computing produce IT environments which are highly ephemeral in character compared to static legacy on-premise environments. The number of SaaS platforms and open source tools available to IT organizations has exploded, providing developers significant options to use the most powerful and agile services, compared to the few standardized vendor suites of the on-premise world. The scale of computing resources required in the cloud has grown exponentially and can be called upon in rapid, sometimes unpredictable, bursts of enlarged computing capacity, compared to the static nature and smaller scale demanded of legacy data centres. The rate of change of application development in the cloud has increased dramatically, as applications are updated in days and minutes compared to weeks and years. These challenges have made it extremely difficult to gain visibility and insight into application and infrastructure performance, and legacy monitoring tools have struggled to adapt.

 

We are in the early days of change. A seismic shift is under way from static IT architectures to dynamic multi-cloud architectures with ephemeral technologies such as containers, microservices and serverless computing. According to Gartner, as the cloud becomes mainstream from 2018 to 2022, it will influence ever-greater portions of enterprise IT decisions, with over $1 billion in enterprise IT spend at stake in 2019. The change permits businesses to improve agility, accelerate innovation and better manage costs. As companies migrate to the cloud and their infrastructure changes, so does the monitoring of this infrastructure. We are in the early days of this huge transformation. According to Gartner, only 5% of applications were monitored as of 2018. Worldwide spend on public cloud solutions, including infrastructure-as-a-service and platform-as-a-service, is anticipated to grow from $60 billion in 2018 to roughly $173 billion in 2022, according to IDC, representing a 30% compound annual growth rate.

 

Collaboration of development and operations teams is critically important. DevOps is a practice and culture characterized by developers and IT operations teams working together, each with ownership of the entire product development cycle. DevOps is necessary for achieving the agility and speed required for developing and maintaining modern applications, but these teams have historically been siloed. In the static, on-premise world, developers and IT operations personnel worked independently, with different objectives, priorities and resources. Developers would focus on writing code to create the best applications, and operations teams were responsible for testing, scaling and deploying the applications. These teams generally did not collaborate and had separate systems and tools to track performance. Often the lack of communication between Dev and Ops teams would result in problems with application performance, because the code may not have been written with the most efficient deployment in mind, resulting in difficulty scaling, latency and other performance issues. The cycle of code rewrites could be protracted, but tolerable in the static world where software releases occurred once a year. In the cloud era, where the frequency of software updates is days or minutes, this communication and coordination between Dev and Ops is necessary to ensure rapid deployment and maximize business performance. With mission-critical processes being powered by software, Dev and Ops teams must collaborate to optimize both technology and business functionality. As a result, Dev and Ops teams need tools that provide a unified perspective of both technology and business performance so as to collaborate in real time to optimize business success.

 

Organizations must digitally transform their companies to compete. There has been a fundamental shift in the way organizations use technology to interact with their clients and compete in the marketplace. This rise in influence is directly connected to the increased quantities of resources organizations are devoting to building differentiated mission-critical software. Poor technology performance negatively affects business results and user experience, in the form of lost earnings, customer churn, damaged brand perception and reduced employee productivity. Thus, companies across all industries are investing to digitally transform their businesses and improve the experience of their customers. At the same time, their investments in monitoring this digital transformation are growing significantly. According to Gartner, enterprises will quadruple their usage of APM due to increasingly digitalized business processes from 2018 through 2021, to reach 20 percent of business applications.

 

Limitations of Offerings

 

Legacy commercial and homegrown technologies were created to operate with monolithic, static, on-premise environments. These approaches typically exhibit the following critical limitations:

 

Not built to work with a wide set of technologies. Legacy technologies aren’t meant to operate in heterogeneous environments with a plethora of vendors, software and technologies. Instead, these offerings are built to work with a limited variety of legacy, on-premise vendor suites and can’t take advantage of the contemporary SaaS and open source technologies the industry has lately embraced.

 

Not built for collaboration between development and operations teams. Legacy offerings often force development and operations groups to use disparate monitoring technologies that don’t share a common framework or set of data and analytics. This makes collaboration between Dev and Ops teams hard and can often cause sub-optimal business outcomes.

 

Lack of sophisticated analytics. Legacy on-premise architectures lack scalability in collecting and processing large, comprehensive datasets. Users of these legacy technologies frequently must manually collect and integrate information from disparate systems and IT environments. The shortage of data scale and aggregation can make it challenging to train contemporary machine-learning algorithms, causing less precise insights.

 

Not built for cloud scale. Legacy technologies aren’t meant for cloud-scale environments and the fast, sometimes unpredictable, bursts of computing resources required by modern software.

 

Not built for dynamic infrastructure. Most offerings were built for static infrastructures where components of the infrastructure and applications are deployed once and rarely change. These solutions cannot visualize and monitor technologies like clouds, containers and microservices, which are highly dynamic and ephemeral in nature.

 

There are a number of contemporary commercial technologies that have attempted to tackle the shortcomings of legacy approaches. These approaches typically exhibit the following limitations:

 

Point solutions lack depth of visibility and insight. Point solutions can’t offer integrated infrastructure monitoring, application performance monitoring and log management on a single platform and, therefore, lack the visibility, insight and context required for optimal collaboration.

 

Monitoring sprawl exacerbates alert fatigue. Disparate tools frequently exacerbate the alert fatigue suffered by many organizations. Gartner notes the need for companies to trim down the number of monitoring tools used, which in the case of bigger enterprises is more than 30, while some smaller businesses have monitoring tools ranging in number from three to 10.

 

Difficult to set up and use. These technologies often have complex implementation processes requiring significant professional services. These offerings are complicated to use, requiring extensive upfront and ongoing training and time commitment.

 

Narrow in scope. These offerings are intended to tackle very specific use cases for a small cadre of users and can require heavy implementation expenses and services in order to derive value. They aren’t easily extensible to an extensive set of use cases for a larger number of technology and business users.

 

Key Strengths of modern solutions

 

The old model of siloed developers and IT operations engineers is broken, and legacy tools used for monitoring static on-premise architectures don’t work in modern cloud or hybrid environments. A cloud-native platform empowers development and operations teams to collaborate, quickly build and improve software, and drive business performance. Empowered by out-of-the-box functionality and simple, self-service installation, customers can quickly deploy the platform to gain application- and infrastructure-wide visibility, often within minutes.

 

Built for dynamic cloud infrastructures. Our platform was created in the cloud and has been built to work with transient cloud technologies such as microservices, containers and serverless computing. Our data model was built to operate at cloud scale with dynamic data collections, and the platform processes more than 10 trillion events a day.

 

Easy to adopt and use. Our system is searchable, with out-of-the-box integrations, customizable dashboards, real-time visualization and prioritized alerting. The platform can be set up in a simple process within seconds, enabling users to derive value without implementation work, technical training or customization. It is extensible across a vast selection of use cases for a broad set of developers, operations engineers and business users. As a result, our platform is used every day and is integral to company operations, and our customers find increasing value in the solution as time passes.

 

Integrated information platform. We were the first to unite the "three pillars of observability" – metrics, traces, and logs – with the debut of our log management solution in 2018. Today, our platform unites infrastructure monitoring, application performance monitoring, log management, user experience monitoring, and network performance monitoring in a single integrated data platform. This approach increases efficiency by decreasing the expense and friction of trying to glean insights from siloed systems. We are able to provide a unified view across the IT stack, including infrastructure and application performance, as well as real-time events. Each of our products is integrated and, taken together, they provide the ability to see metrics, traces and logs side-by-side and perform correlation analysis.

 

Built for collaboration. Our platform was built to break down the silos between developers and operations teams in order to help organizations adopt DevOps practices and enhance overall business performance. We provide development and operations teams with a set of tools to develop a joint comprehension of application performance and insights into the infrastructure supporting the applications. Additionally, our customizable dashboards can be shared with business organizations to provide them with actionable insights.

 

Cloud agnostic. Our system is designed to be deployable across all environments, including public cloud, private cloud, on-premise and multi-cloud hybrid environments, enabling organizations to diversify their infrastructure and decrease individual vendor dependence.

 

Ubiquitous. Cloud systems are often deployed across the entire infrastructure of a customer, which makes them ubiquitous. In comparison to legacy systems that are frequently used exclusively by a few users within a business’s IT operations group, modern systems ought to be part of the daily lives of developers, operations engineers and business leaders.

 

Integrates with our clients’ complex environments. We empower development and operations groups to harness the complete range of SaaS and open source tools. We have over 350 out-of-the-box integrations with key technologies to provide substantial value to our customers without the need for professional services. Our integrations provide for detailed data point aggregation and up-to-date, high-quality customer experiences across heterogeneous IT environments.

 

Powered by machine learning and analytics. Our system ingests large amounts of information into our unified data warehouse. We create actionable insights using our advanced analytics capabilities. Our platform includes machine learning that can cross-correlate metrics, logs and traces to identify outliers and notify users of potential anomalies before they impact the company.

 

Scalable. Our SaaS platform is highly scalable and is delivered via the cloud. It is massively scalable, currently monitoring more than 10 trillion events per day and millions of containers and servers at any point in time. We offer easily accessible data retention at full granularity for extensive intervals, which can provide clients with a comprehensive view of their historical data.

 

Key Benefits

 

Our systems provide the following key benefits to our customers:

 

Enable operational efficiency. Our solution is easy to set up, which eliminates the need for professional services and heavy implementation costs. We have over 350 out-of-the-box integrations with key technologies, from which our customers can derive value, avoiding the internal development costs and services necessary to create those integrations. Our customer-centric pricing model is tailored to customers’ desired usage requirements. For example, our log management solution has differentiated pricing for logs indexed versus logs ingested. Our platform enables customers to better understand the operational demands of their software and IT environments, allowing greater efficiency in resource allocation and spend on cloud infrastructure.

 

Using APM, infrastructure and log information in our system, our customers can quickly isolate the source of application issues in one place, where previously they would have been required to spend hours investigating with disparate tools. Additionally, our machine learning algorithms are trained on the volume of data that our customers send us to discover anomalies and predict failures in client systems in real time, something that is not possible to do manually.

 

Improve agility of development, operations and business teams. We remove the silos between development and operations teams and provide a platform that enables agile and efficient development through the adoption of DevOps. Our platform enables development and operations teams to collaborate with a shared understanding of information and analytics. This helps them develop a joint understanding of application performance and shared insights into the infrastructure behind the applications. Additionally, our customizable and easy-to-understand dashboards can be shared with business organizations to supply them with real-time actionable insights into company performance.

 

Read more on Crypto Exchange Monitoring Software

 

Accelerate digital transformation. We empower customers to take advantage of the cloud and to develop and maintain mission-critical applications with agility and confidence, despite time-to-market pressure and the complexity of modern infrastructure. Because of this, our system helps accelerate innovation cycles, deliver exceptional digital experiences and optimize business performance.

Introduction to Liquidity Metrics

This paper offers a summary of indicators that may be used to illustrate and examine liquidity developments in financial markets. These measures are applied to money, foreign exchange and capital markets to demonstrate their usefulness. Many measures have to be considered, since there is no single theoretically correct and universally accepted measure of a market’s level of liquidity, and since market-specific variables and peculiarities have to be taken into account.

 

Read more about our tool for measuring crypto exchange quality – Liquidity Analytics Dashboard

 

Liquid markets are generally perceived as desirable because of the benefits they provide, such as allocational and informational efficiency. The benefit may not hold for investors collectively, however. As Keynes noted (1936, p. 160): “For the fact that each individual investor flatters himself that his commitment is ‘liquid’ (though this cannot be true for all investors collectively) calms his nerves and makes him much more willing to run a risk.” Consequently, recent crises in financial markets in particular have sparked research on how to gauge the state of market liquidity and how to better forecast and prevent liquidity crises.

 

This paper has two purposes. First, it offers an overview of the various concepts associated with liquid financial markets. Second, it applies a number of liquidity measures to the money, foreign exchange and capital markets of a group of countries, as outlined at the end of this article.

 

The work was also motivated by market analysts such as Borio (2000), who reports that in the run-up to financial crises markets frequently seem unusually liquid, but that in times of stress liquidity tends to vanish.

 

Market participants consider a financial asset liquid if they can sell large quantities of the asset without adversely affecting its price. Liquid financial assets are thus characterized by low transaction costs; easy trading and timely settlement; and large trades having only a limited impact on the market price. The significance of some of these qualities can change over time. During periods of stability, for example, the perception of an asset’s liquidity may chiefly reflect transaction costs. During periods of stress and changing fundamentals, instantaneous price discovery and adjustment to a new equilibrium become more important.

 

Liquid markets often display five attributes:

  • tightness
  • immediacy
  • depth
  • breadth
  • resiliency

 

Tightness refers to low transaction costs, such as the difference between buy and sell prices, like the bid-ask spread in quote-driven markets, as well as implicit costs. Immediacy represents the speed with which orders can be executed and, in this context also, settled, and thus reflects, among other things, the efficiency of the trading, clearing and settlement systems. Depth refers to the existence of abundant orders, either actual or easily uncovered, of potential buyers and sellers, both above and below the price at which a security currently trades. Breadth means that orders are both numerous and large in volume with minimal impact on prices. Resiliency is a characteristic of markets in which new orders flow quickly to correct order imbalances, which tend to move prices away from what fundamentals warrant.
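To make the tightness and depth dimensions concrete, the following sketch computes the quoted absolute and relative bid-ask spread and the quantity available at the best quotes from an order book snapshot. The book data is invented for illustration; real measures would be averaged over time and weighted across levels.

```python
# Tightness and depth from a hypothetical order book snapshot.
# Bids and asks are (price, quantity) pairs, best quotes first.
bids = [(99.95, 500), (99.90, 800), (99.85, 1200)]
asks = [(100.05, 400), (100.10, 900), (100.15, 1500)]

best_bid, best_ask = bids[0][0], asks[0][0]
mid = (best_bid + best_ask) / 2

# Tightness: quoted absolute and relative spread.
abs_spread = best_ask - best_bid
rel_spread = abs_spread / mid  # often expressed in basis points

# Depth: quantity available at the best quotes.
depth_best = bids[0][1] + asks[0][1]

print(f"absolute spread: {abs_spread:.2f}")            # 0.10
print(f"relative spread: {rel_spread * 1e4:.1f} bps")  # 10.0 bps
print(f"depth at best quotes: {depth_best}")           # 900
```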

 

These attributes reflect different dimensions of the degree to which an asset can be converted into legal tender quickly and at low cost.

 

These dimensions are to some degree overlapping. Moreover, most of the available data do not correspond exactly to these dimensions, which complicates their measurement. A variety of factors have to be considered because they influence the dimensions of liquidity; they range from the microstructure of the market to the central bank’s implementation of its monetary policy.

 

Understanding the microstructure of the market is crucial when proxies, such as bid-ask spreads and turnover ratios, are used as liquidity indicators. A market may be a physical place or a platform that enables buyers and sellers to interact. Academics often have in mind a world with a Walrasian auctioneer performing a price tâtonnement process that ensures trading at market-clearing prices, in which prices are a sufficient statistic. In the practitioner’s world, however, trading can occur on a variety of platforms (for example, dealer or auction markets) at non-market-clearing prices, owing to factors such as market illiquidity.
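As a concrete illustration of one such proxy, the sketch below computes a simple turnover ratio, conventionally defined as traded volume over the outstanding stock of the asset. The figures are invented, and as the paragraph above notes, the ratio is only meaningful once the market’s microstructure and reporting conventions are understood.

```python
# Turnover ratio: traded volume relative to the outstanding amount.
# All figures are hypothetical.
def turnover_ratio(traded_volume: float, outstanding: float) -> float:
    """Higher values suggest the asset changes hands more often,
    a rough, microstructure-dependent sign of liquidity."""
    return traded_volume / outstanding

# E.g., 150 million units traded against 2 billion outstanding:
print(f"{turnover_ratio(150e6, 2e9):.2%}")  # -> 7.50%
```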

 

Dealers are said to provide liquidity because they make a market. But since dealers usually try to square their positions or maintain a predetermined structural position before the close of the day, they only “supply” liquidity by taking inventory positions as long as they presume buyers and sellers will continue to emerge. In an auction market, prospective buyers and sellers submit orders, and an electronic system or brokers match them. Auction markets can be order-driven or price-driven, and continuous if there are sufficient trades. Market intermediaries in auction systems can additionally take inventory positions in order to facilitate liquidity (e.g., the so-called specialists in widely traded securities). Trading systems that allow participants to submit limit orders enhance liquidity. The intermediaries providing access to the trading systems can cover their costs by charging a commission, or else they quote bid and ask prices to be paid by the buyers and sellers.
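The sketch below is a deliberately simplified continuous order-driven book with price-time priority: an incoming order is matched against resting orders on the opposite side, best price first and oldest order first within a price level, and any remainder rests as a limit order. It is an illustration of the mechanics just described, not a real matching engine (no order IDs, cancellations or tick-size rules).

```python
# Toy continuous order-driven book with price-time priority.
from collections import deque

class Book:
    def __init__(self):
        self.bids = []  # resting buys:  (price, deque of quantities)
        self.asks = []  # resting sells: (price, deque of quantities)

    def submit(self, side, price, qty):
        own = self.bids if side == "buy" else self.asks
        opp = self.asks if side == "buy" else self.bids
        crosses = (lambda p: price >= p) if side == "buy" else (lambda p: price <= p)
        # Price priority: match against the best opposite level first.
        while qty > 0 and opp and crosses(opp[0][0]):
            level_price, queue = opp[0]
            # Time priority: oldest order within the level first.
            while qty > 0 and queue:
                fill = min(qty, queue[0])
                print(f"trade: {fill} @ {level_price}")
                qty -= fill
                queue[0] -= fill
                if queue[0] == 0:
                    queue.popleft()
            if not queue:
                opp.pop(0)
        # Rest any remainder as a limit order on our own side.
        if qty > 0:
            for p, q in own:
                if p == price:
                    q.append(qty)
                    return
            own.append((price, deque([qty])))
            own.sort(key=lambda lv: -lv[0] if side == "buy" else lv[0])

book = Book()
book.submit("sell", 100.10, 300)
book.submit("sell", 100.05, 200)
book.submit("buy", 100.05, 250)  # prints "trade: 200 @ 100.05"; 50 rests as a bid
```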

 

A distinction is made between the primary market, in which new issues are offered, and the secondary market, where those who have purchased issues in the primary market can resell them. The secondary market consequently provides liquidity.

 

It is important to understand the reporting requirements for trades in different markets before trading volumes can be used as a liquidity indicator.

 

An asset is liquid if it can easily be converted into legal tender, which by definition is fully liquid. Some financial claims, such as demand deposits, are almost perfectly liquid (provided that the credit institution is liquid), as they can be converted into cash without cost or delay under normal conditions, while the conversion of other claims into legal tender can involve brokers’ commissions, settlement delays, etc. The emphasis is on transaction costs and immediacy: it concerns the ease with which, in the absence of new information changing an asset’s fundamental price, large quantities of the asset can be disposed of quickly at a reasonable price.
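One simple way to make the notion of disposing of large quantities quickly at a reasonable price concrete is to walk a hypothetical bid side of an order book and compare the proceeds of an immediate sale against a mid-price benchmark. The book, mid price and sizes below are invented; this is an illustrative cost-of-immediacy calculation, not a measure prescribed by the paper.

```python
# Sketch: proceeds and shortfall from immediately selling `qty`
# by sweeping a hypothetical bid side of an order book.
bids = [(99.95, 500), (99.90, 800), (99.85, 1200)]  # (price, quantity)

def liquidation_proceeds(bids, qty):
    """Return (proceeds, unfilled) from sweeping the bid side."""
    proceeds, remaining = 0.0, qty
    for price, available in bids:
        take = min(remaining, available)
        proceeds += take * price
        remaining -= take
        if remaining == 0:
            break
    return proceeds, remaining

qty = 1000
proceeds, unfilled = liquidation_proceeds(bids, qty)
mid = 100.0  # assumed mid price used as the benchmark
shortfall = mid * (qty - unfilled) - proceeds
print(f"proceeds: {proceeds:.2f}, shortfall vs mid: {shortfall:.2f}")
# -> proceeds: 99925.00, shortfall vs mid: 75.00
```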

 

A financial market’s liquidity depends on the substitutability among the assets traded in the market and on how liquid each of those assets is. Where there are different issuers, as in the bond and equity markets, credit risk can impede substitutability and result in segmentation of the market. Even assets with the same issuer may have distinct attributes, for example different maturities in the market for government securities, or different voting rights for preference shares. This aggregation problem makes it difficult to apply measures aimed at gauging the liquidity of a market as a whole.

 

This paper describes measures for judging an asset’s market liquidity, with a view to assessing whether a financial market, or at least some of its segments, can be characterized as liquid.

 

Read more about our tool for market making strategies for crypto exchanges  – Liquidity Engine

 

Our next article will classify liquidity measures according to the dimension of liquidity they best capture. It also discusses factors that may affect their interpretation and their ability to capture a given facet of liquidity, as well as issues related to constructing the measures. Section III applies the liquidity measures to the money, foreign exchange and capital markets of a group of countries. Section IV lists some of the qualitative aspects that are important to consider when comparing the liquidity measures across markets and countries. Section V notes that liquidity measures may behave differently during periods of stress.

 

Who is moving FinTech forward in continental Europe? Thoughts after FinTech Forum on Tour.

By Michal Rozanski, CEO at Empirica.

In the very centre of Canary Wharf, London’s financial district, in a brand new EY building, a very interesting FinTech conference took place – FinTech Forum on Tour. The invitation-only conference targeted the most interesting startups from the investment area (InvestTech) in mainland Europe. The event brought together stakeholders from across the entire financial ecosystem. As Efi Pylarinou noted, the regulator, the incumbents, the insurgents and the investors were all represented.

 

Empirica was invited to present its flagship product – Algorithmic Trading Platform, a tool professional investors use for building, testing and executing algorithmic strategies. However, it was amazing to see what is happening in other areas of the investment industry. There were a lot of interesting presentations by companies transforming the FinTech industry in the areas of asset and wealth management, social trading and analytics.

 

The conference was opened with a keynote speech by Anna Wallace from the FCA. Anna talked about the mission of the FCA’s Innovation Hub: to promote innovation and competition in the financial technology field and to ensure that rules and regulations are respected. Whilst listening to Anna, it became clear to me what real advantage London holds in the race to become the global FinTech capital – London has Wall Street, Silicon Valley and the Government in one place – and, most importantly, they cooperate, trying to push things forward in one direction.

 

FinTech Forum on Tour

 

Robo-advisory

A short look at the companies presenting themselves at the event leads to the conclusion that the hottest sector of FinTech right now is robo-advisory. It’s so hot that, as one of the panellists noted, it’s getting harder and harder for robo-advisory startups to differentiate themselves. At FinTech Forum on Tour this sector was represented by AdviseOnly from Italy, and In2experience, Niiio, Vaamo and Fincite – all from Germany. Ralf Heim from Fincite presented an interesting ‘algo as a service’ toolkit and white-label robo-advisory solutions. Marko Modsching from niiio revealed the motivation of retail customers: “they do not want to be rich, they do not want to be poor”. Scalable Capital stressed the role of risk management in its robo-advisory offering.

 

Social analysis/Sentiment/ Big Data

The social and sentiment analysis area keeps growing and gaining traction. Every day there is more data, and more trust in the results of backtesting as that data builds up over the years. The social media space is gaining ground: investment funds as well as FinTech startups are finding new ways to use sentiment data for trading. And since this is inseparably tied to the analysis of huge amounts of data, the systems behind it are technically non-trivial.

Anders Bally gave an interesting presentation about how to deal with sentiment data, showing how his company Sentifi identifies and ranks financial market influencers in social channels and what they discuss.

Sentitrade showed its sentiment engine for opinion mining, which uses a proprietary sentiment indicator and trend-reversal signals. Sentitrade concentrates on German-speaking markets.

 

Asset management

In the area of asset management, an interesting pitch was given by Cashboard, which offers alternative asset classes and is now preparing for a huge TV marketing campaign. StockPulse showed how to combine information derived from social networks and base investment decisions on the overall sentiment. United Signals allows for social investing by making it possible to trade by copying the transactions of chosen trading gurus with a proven track record, all in an automated way. And finally BondIT, an Israeli company, presented tools for fixed income portfolio construction, optimization and rebalancing with the use of algorithms.

 

Bitcoin and Blockchain

An interesting remark was made by one of the panellists: ‘we have barely scratched the surface of what blockchain technology can be applied to in the financial industry’. Looking at the latest news reports saying that big financial institutions are investing heavily in blockchain startups and in their own research in this field, there is definitely something in it.

A company from this sector of FinTech – Crypto Facilities, represented by its CEO Timo Schaefer – showed the functionalities of its bitcoin derivatives trading platform.

 

Other fields

Hervé Bonazzi, CEO of Scaled Risk, presented his company’s technologically advanced Big Data platform for financial institutions, covering risk management, compliance, analytics and fraud detection, with Hadoop under the hood and low-latency processing. Ambitious as it sounds.

In the analysis of financial data for company valuations, Valutico presented a tool that uses big data, AI and swarm intelligence. Dorothee Fuhrmann from Prophis Technologies (UK) presented a generic tool for financial institutions to derive value and insights from data, interestingly describing indirect exposures and a hidden transmission mechanism.

Stephen Dubois showed what Xignite (US) has to offer financial institutions and other FinTech startups in the area of real-time and historical data stored in the cloud and accessible via a proprietary API.

Qumram, in an energetic presentation delivered by Mathias Wegmueller, described technology for recording online sessions on web, mobile and social channels, allowing for the analysis of user behaviour and the strengthening of internal security policies.

 

Conclusion

London is the place to be for FinTech startups. No city in Europe offers such possibilities: tax deductions for investors, direct help from the UK regulator (the FCA), and a great choice of incubators and bootcamps for startups. No place gives such a kick. Maybe Silicon Valley is the best place to find an investor for a startup, and maybe Wall Street is the centre of the financial world, but London is the place that combines both the tech and the finance. It has a real chance of becoming the FinTech capital of the world.

 

About the organizers

The people responsible for creating both a great and professional atmosphere at the event were Samarth Shekhar and Michael Mellinghoff. Michael was a great mentor of mine who transformed my pitch from a long and quite boring list of our product’s functionalities into something that was bearable for the audience. Michael, let me thank you once more for the time and energy you devoted to Empirica’s pitch!

 

And because the FinTech scene in our region is not yet well organized, I sincerely advise all FinTech startups from Central and Eastern Europe to attend the recurring FinTech Forum events in Frankfurt, organized by the professionals at Techfluence!

 
Read about our Lessons learned from FinTech software projects.

 

 


 

 

 

Free version of Algorithmic Trading Platform for retail investors

We have just released the beta of Empirica – Algorithmic Trading Platform for retail investors! It is free for life for the development, testing and optimization of trading algorithms.

Our development team (the very team that implemented the entire system) also provides full support in algorithm development as well as connectivity to brokers. If you need help, just contact us.

Among its many features, what is unique is our exchange simulation, where you can influence the market conditions under which you test your algorithms. No other software offers such a realistic level of simulation.

In the paid versions we offer the execution of algorithms in a robust server-side architecture.

We look forward to your feedback!

Best regards,

Michal Rozanski
Founder and CEO at Empirica
twitter: @MichalRoza
https://empirica.io


Empirica Trading Platform – https://empirica.io

Our platform implemented by a large brokerage house!

Empirica has successfully completed the implementation of its Algorithmic Trading Platform in one of the largest brokerage houses in Poland.

The brokerage house will use our software to:

  • aid its internal trading operations, such as market making of derivatives on the Warsaw Stock Exchange
  • offer the functionalities of our platform to its institutional clients, which will be able to build, test and execute their own algorithmic trading strategies

The implementation included connecting our software directly to the Warsaw Stock Exchange’s system (the Universal Trading Platform delivered by NYSE Technologies), as well as integration with the brokerage house’s transaction systems. Additionally, we fulfilled and successfully passed tests covering the most demanding security, stability and performance requirements.

This implementation is an important milestone for our system. Its use by a team of market makers is proof that our system is capable of performing high-throughput, low-latency operations at the level required by the most sophisticated traders on capital markets.