Top stories

Risk Focus investors quick to commit to initial funding round

Jun 27, 2016

London & New York: Risk Focus Inc., the leading provider of regulatory reporting control and compliance solutions to the global capital markets, raised USD 750,000 through the issue of a convertible note to a group of private investors, in a round that was led by Jon Barlow, FinTech entrepreneur, investor and advisor. This is the first time that Risk Focus has raised external funding, having been self-funded and profitable since it was founded in 2004. The note is capped at USD 2 million, allowing Risk Focus to further increase funding in this round.

read more

eXtremeDB Financial Edition DBMS Sweeps Records in Big Data Benchmark

Jun 27, 2016

McObject®, developer of the eXtremeDB® Financial Edition database management system (DBMS), announced a sweep of records by McObject’s DBMS running on an IBM POWER8 S824L Linux server in the Kanaga suite of STAC-M3™, key financial industry benchmarks of tick analytics on Big Data. The test results, audited and published by the Securities Technology Analysis Center (STAC®), position eXtremeDB Financial Edition as the most predictably fast and scalable tick data management solution for trading systems contending with high data volumes and increasingly complex queries in today’s capital markets.

read more

Saxo Bank's morning comments on Brexit trading

Jun 24, 2016

Going into the UK referendum we have considered it important to take prudent measures to reduce our clients’ exposure to risk and to be fully transparent. It has been essential for Saxo Bank to explain to our clients that neither clients nor Saxo Bank benefit from overleveraging and that we, with our clients' interest firmly at heart, are doing our utmost to educate on the range of options available.

read more
All news

Latest Blog Posts

RegAT – round 2

Henri Pegeron, Fidessa

Jun 22, 2016
Regulation AT forges ahead, this time with a Round 2 panel convened on June 10th at which the CFTC focused the discussion on five major sticking points:
1. Amendments to the proposed definition of direct electronic access (DEA)
2. Quantitative measures to establish the population of persons
3. Alternatives to require each person defined as an “AT Person” to implement and utilize pre-trade risk controls
4. AT Persons’ compliance with requirements under RegAT when using third-party algorithms or systems
5. Source code retention and access
Round 2 is a welcome step in the right direction, and by revisiting these topics with the people who know them best, we can take a proactive step towards demystifying RegAT. The choice of topics also reveals a sense of urgency in getting RegAT moving, and soon. That sentiment was echoed by CFTC Chairman Massad at a conference in New York earlier this month where he expressed a willingness to split RegAT into more digestible parts. The next question is what to expect in Round 3. The untold story of RegAT is that the effectiveness of US-based regulation can so often depend on the person serving as US President at the time of its implementation. RegAT faces a race of its own, then, to get the rules finalized before the current political uncertainty materializes into a reality in November.
read more

At last, it’s time to industrialise the back office

John O’Hara, Taskize

Jun 21, 2016
Standardisation and automation are the logical response to problems of scale and complexity. If a simple task needs doing a couple of times a year, you have little incentive to improve the process. If a complex task has to be executed every few hours, minutes or even seconds, you soon look for ways to systematise the management of the task, to get it done quickly and efficiently. This principle has driven progress from the industrial revolution to the era of the production line to today’s trading rooms, where order and execution management systems have evolved to cope with vastly increased transaction volumes across multiple asset classes and counterparties. But it only applies if labour costs make it more efficient to automate a process than to hire more people to handle increased volumes or complexity. Securities settlement was relatively slow to be automated in India, for example, because little marginal cost was added by banks employing extra staff to courier share certificates between counterparties. Similarly, broker-dealers globally have regularly increased back-office headcount to handle periodic rushes to meet new regulatory obligations or unexpected upsurges in demand. As surveillance requirements increase under the EU Market Abuse Directive and MiFID II, banks and brokers are recruiting legions of para-legal staff, often seconded from law firms or consulting groups, to plough through voice recordings of trading desk conversations when required to provide further data on trades under investigation by regulators. At the same time, firms are looking to automate and standardise their surveillance processes, but investment cycles and reduced budgets mean they must also deploy short-term, labour-intensive and ultimately unsustainable solutions. Today, the pressure on broker-dealers to standardise and automate back-office tasks is reaching breaking point.
Largely, the pressure stems from regulatory drivers, not an increase in transaction flow, meaning that extra budget to quickly facilitate process improvement is hard to come by. Overwhelmingly, the aims of regulatory reform - greater transparency, stability and efficiency - are laudable, but collectively they represent an unprecedented squeeze on resources. For example, the Central Securities Depository Regulation has already cut securities settlement cycles in Europe to two days from three, giving back-office staff less time to fix the same number of errors or exceptions (indeed more, as many clients have turned to broker-dealers to help them handle the transition). Soon the same regulation will impose mandatory buy-ins, reducing the timescales permitted to counterparties to find a security in the event of a trade fail. Separately, the volume of collateral transfers is expected to increase substantially, as counterparties to both cleared and non-cleared OTC derivatives are obliged to formalise margin arrangements. Even if collateral transfer fail rates remain at 3%, a recent study by PwC and DTCC-Euroclear GlobalCollateral Ltd has estimated that brokers will need to increase dedicated staff from three to 16 between 2015 and 2020, just to manage the extra workload from non-cleared OTC derivatives trades. Faced with so many pressures to standardise and automate back-office processes, the COO is painfully aware that traditional options do not fit the bill. COOs cannot afford, in any sense of the word, to brief a team to conduct root and branch, multi-year, multi-asset class, multi-jurisdiction process automation projects. But the cupboard of ideas is far from bare.
Just as order management systems extracted the core data from a portfolio manager’s instructions to the trading desk, reducing lengthy conversations to a few standardised data fields, it is possible to strip down the current cumbersome methods used to outline the actions needed to fix a failed transaction or settlement snafu. Similarly, the execution management system now sends precise instructions to multiple broking counterparts on how and where to execute a trade, replacing time-consuming and somewhat vague phone calls to various sales traders. Like executing an order, resolving a collateral fail or buy-in is a complex, multilateral activity. But the difference today is that the automation and standardisation of back-office tasks need not take as long as the evolution of OMSs and EMSs, via development of message protocols and extensive industry-level collaboration. As such, a task management system – or utility – for the back office is not a far-off dream, but a tool that could rapidly reduce manual intervention. In many respects, the hard miles of back-office standardisation and automation have been travelled, thanks to the networked connectivity that already exists between counterparties, and the new generation of IT communications capabilities that can support the shared terminology, directories and templates needed to process complex back-office interactions in a more cost-effective and scalable manner than broker-dealers have achieved to date. While some of the technology infrastructure to facilitate back-office task automation and standardisation is relatively new, the enabling, driving force is the will to change. Higher levels of standardisation and automation will not only support complex processes and relationships more efficiently, but will also help broker-dealers achieve broader regulatory objectives of transparency and stability.
As such, COOs need to look beyond traditional approaches to consider how fintech-based solutions can tackle today’s problems today, not tomorrow. We’ve thrown people at the back office for too long; now is the time to give them the tools to do the job better.
read more

Data lakes vs data streams: which is better?

Guy Warren, ITRS Group

Jun 20, 2016
Data lakes and data streams: two of the hottest data buzzwords du jour and as likely as any pair to spark an argument between data scientists backing one or the other. But which really is better? Firstly, what are these lakes and streams? A data lake is still a fairly new concept that refers to the storage of a large amount of unstructured and semi-structured data. It addresses the need to store data in a more agile way than is possible with traditional databases and data warehouses, where a rigid data structure and data definition are required. The data is usually indexed so that it is searchable, either as text or by a tag which forms part of the schema. The flexibility factor is that each new stream of data can come with no schema, or its own schema, but either way can still be added to the data lake for future processing. Why is this useful? Because businesses are producing increasing amounts of useful data, in various formats, speeds and sizes. To realise the full value of this data, it must be stored in such a way that people can dive into the data lake and pull out what they need there and then, without having to define the data dictionary and relational structure of the data in advance. This increases the speed at which data can be captured and analysed, and gives much more flexibility for adding new sources to the lake. This makes lakes much more flexible than traditional storage for data scientists or business analysts, who are constantly looking for ways to capture and analyse their data, and even pour it back into the lake to create new data sources from their results. Perhaps someone has run an analysis to find anomalies within a subset of the data and has then contributed this analysis back to the data lake as a new source. However, to get the best out of a complex data lake, a data curator is still recommended to create consistency and allow joins across data from different sources.
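The schema-on-read idea described above can be sketched in a few lines. This is a toy illustration only; the record shapes, tags and helper names are invented, not taken from any particular data lake product:

```python
# Toy schema-on-read store: records of any shape go in as-is; structure
# is imposed only at query time. All names here are illustrative.
lake = []

def ingest(record, tags):
    """Store a record of any shape, indexed only by free-form tags."""
    lake.append({"tags": set(tags), "data": record})

# Three sources, three different schemas -- all accepted without a
# predefined data dictionary or relational structure.
ingest({"trade_id": 1, "qty": 100, "sym": "VOD.L"}, tags=["trades"])
ingest({"msg": "login failed", "host": "srv-12"}, tags=["logs"])
ingest({"sym": "VOD.L", "close": 228.5}, tags=["prices", "eod"])

def query(tag, predicate=lambda r: True):
    """Dive into the lake: filter by tag, then by any ad-hoc condition."""
    return [e["data"] for e in lake if tag in e["tags"] and predicate(e["data"])]

vod = query("prices", lambda r: r.get("sym") == "VOD.L")
```

Results of an analysis could be fed back with another `ingest` call under a new tag, which is the "pour it back into the lake" pattern the post describes.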
A data stream on the other hand, is an even newer concept in the general data science world (except for people who use Complex Event Processing engines which work on streaming data). In contrast to deep storage, it’s a result of the increasing requirement to process and perform real-time analysis on streaming data. Highly scalable real-time analysis is a challenge that very few technologies out there can truly deliver on...yet. The value of the data stream (versus the lake) is the speed and continuous nature of the analysis, without having to store the data first. Data is analysed ‘in motion’. The data stream can then also be stored. This gives the ability to add further context or compare the real-time data against your historical data to provide a view of what has changed – and perhaps even why (which depending on your solution, may impact responsiveness). For example, by comparing real-time data on trades per counterparty against historical data, it could show that a counterparty, who usually submits a given number of trades a day, has not submitted as many trades as expected. A business can then investigate why this is the case and act in real-time, rather than retroactively or at the end of day. Is it a connection problem with the counterparty, is the problem on the business’ side or the client’s? Is it a problem with the relationship? Perhaps they’ve got a better price elsewhere? All useful insight when it comes to shaping trading strategy and managing counterparty relationships. The availability of these new ways of storing and managing data has created a need for smarter, faster data storage and analytics tools to keep up with the scale and speed of the data. There is also a much broader set of users out there who want to be able to ask questions of their data themselves, perhaps to aid their decision making and drive their trading strategy in real-time rather than weekly or quarterly. 
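The counterparty example in the paragraph above can be sketched as a minimal stream check: count trades as they arrive and compare against a pro-rated historical average. The names, averages and 50% threshold are invented for illustration:

```python
from collections import defaultdict

# Illustrative stream check: flag counterparties whose live trade count
# falls well short of their historical daily average. All figures and
# counterparty names are invented.
historical_avg = {"BANK_A": 120, "BANK_B": 45, "BANK_C": 300}
live_counts = defaultdict(int)

def on_trade(counterparty):
    """Called for each trade as it streams in ('data in motion')."""
    live_counts[counterparty] += 1

def shortfalls(fraction_of_day, threshold=0.5):
    """Counterparties whose live volume is below `threshold` times the
    pro-rated historical average at this point in the trading day."""
    flagged = []
    for cp, avg in historical_avg.items():
        expected = avg * fraction_of_day
        if live_counts[cp] < threshold * expected:
            flagged.append(cp)
    return flagged
```

A check like `shortfalls(0.5)` run at midday would surface a usually active counterparty that has gone quiet, so the business can investigate in real time rather than at end of day.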
And they don’t want to rely on or wait for someone else such as a dedicated business analyst or other limited resource to do the analysis for them. This increased ability and accessibility is creating whole new sets of users and completely new use cases, as well as transforming old ones. Look at IT capacity management, for example; hitherto limited to looking at sample historical data in a tool like a spreadsheet and trying to identify issues and opportunities in the IT estate. Now, it is possible to compare real-time and historical server data with trading data, i.e. what volume of trades generated what load on the applications processing the trades. It is also possible to spot unusual IT loads before they cause an issue. Imagine an upgrade to a key application: the modern capacity management tools can detect that the servers are showing unusually high load given the volume of trades going through the application, catching a degradation in application performance before a high trading load causes an outage. In the future, by feeding in more varied and richer sources of data (particularly combining IT and business data) and implementing machine learning algorithms, it will be possible to accurately predict server outages or market moves that could trigger significant losses if not caught quickly. So: which is better, a data lake or a data stream? The answer is both. Businesses need to be able to process and analyse data at increasingly large volumes and speed, and from across a growing number of sources as the data arrives in a stream, along with the ability to both access and analyse the data easily and quickly from a data lake. Historically, the problem has been that standard tooling doesn’t easily allow for mixing these two paradigms – but the world is changing!
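The capacity-management example can likewise be sketched: derive a rough load-per-trade baseline from history, then flag windows where observed load is out of line with the trade volume going through. All figures and the tolerance factor are invented for illustration:

```python
# Illustrative capacity check: learn average CPU% per trade from
# historical windows, then flag any window whose load exceeds what the
# trade volume would justify (e.g. after a bad application upgrade).
history = [(1000, 22.0), (2000, 41.0), (4000, 80.0)]  # (trades, cpu %)

# Baseline: mean CPU% consumed per trade across the historical windows.
per_trade = sum(cpu / trades for trades, cpu in history) / len(history)

def is_degraded(trades, cpu_pct, tolerance=1.5):
    """True when observed load exceeds the volume-justified baseline by
    more than `tolerance` times -- unusual load for that trade count."""
    expected = per_trade * trades
    return cpu_pct > tolerance * expected
```

With this baseline, 2000 trades would normally cost around 41% CPU, so a reading of 90% at that volume is flagged while 45% is not, catching the degradation before a heavier trading session turns it into an outage.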
read more
All Blog Posts

Interview

MiFID II – the good, the bad and the regulatory

Jun 27, 2016

ATMonitor talks with Christer Wennerberg, Head of Market Structure at Itiviti. Wennerberg discusses the main differences between equities and derivatives markets with regard to regulation, fragmentation and the implementation of MiFID II.

read more

MAR – What you need to know

Jun 22, 2016

ATMonitor talks with Johannes Frey-Skött, Principal Software Engineer at Itiviti. Frey-Skött discusses the implementation and components of a complete MAR solution.

read more

Talking Trading with Itiviti

Jun 21, 2016

ATMonitor talks with Chris Anderson, Senior Product Manager at Itiviti. Anderson discusses what sets Tbricks apart from other trading solutions, as well as current trends within the market and the challenges faced by clients.

read more
All interviews

Survey

Execution Management Systems Survey

Now in its fourth year, The TRADE magazine, in conjunction with ATMonitor, is once again running its industry-leading survey of Execution Management Systems for 2016. If you are trading electronically, we invite you to comment on your use of execution management systems, which features you consider important and how you rate their current capabilities. All submissions are reported in aggregated and anonymous format. Please rate your EMS vendors by completing the online questionnaire available here

read more
All surveys

Research

East meets West - Chinese brokers continue international expansion

Fidessa

May 04, 2016

Examining the continuing expansion of Chinese brokers into international markets and the impact on the global financial landscape.

read more
All research

Video showcase

Corvil working with RSJ

Corvil

Watch Michal Sanak, CIO, RSJ Algorithmic Trading, discuss working with Corvil.

read more

Corvil working with Tradition

Corvil

Watch Yann L'Huillier, CIO, Tradition, and Alex Krovina, CTO, Tradition, discuss working with Corvil.

read more

All videos