Thursday, September 13, 2007

Data Latency Playing An Ever Increasing Role In Effective Trading

Wall Street’s quest to process data at the speed of light relies on the physical proximity of servers to overcome the technical barriers of data latency.
By Richard Martin, InformationWeek
Wall Street & Technology
May 25, 2007

http://www.wallstreetandtech.com/showArticle.jhtml;jsessionid=MMCMY42Y0Q0QCQSNDLPSKH0CJUNN2JVN?articleID=199702208

This article explores the importance of increased communication and efficiency in the securities markets through electronic trading. Some brokers see electronic trading as a threat, and not without reason: the number of NYSE floor traders has fallen by a quarter over the past year. Electronic trading now accounts for 60-70% of trading on the New York Stock Exchange.

Yet overall, automated decision-making and execution systems have increased the efficiency of the trading process and reduced the huge market swings that occur when a backlog of buy or sell orders is executed. Electronic trading leads to smoother, more stable financial markets.

Richard Martin, the author of the article, also stresses that speed is a major issue for firms in increasing their competitiveness in the market: "a 1-millisecond advantage in trading applications can be worth millions of dollars a year to a major brokerage firm." Latency is the time it takes for an order to travel from execution to confirmation; reducing it can mean the difference between securing an order at the best price and missing it.
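To make the latency idea concrete, here is a rough Python sketch (my own illustration, not from the article) of what "latency" means in practice: the clock runs from order submission to confirmation. The order gateway here is a stand-in function that just simulates work, not a real trading API.

```python
import time

def send_order(order):
    """Stand-in for a real order-gateway call; it only simulates
    network plus matching-engine time with a short sleep."""
    time.sleep(0.0005)
    return {"status": "confirmed", "order": order}

def measure_latency_ms(order):
    """Round-trip latency from submission to confirmation, in ms."""
    start = time.perf_counter()
    ack = send_order(order)
    elapsed_ms = (time.perf_counter() - start) * 1000.0
    return ack, elapsed_ms

ack, latency = measure_latency_ms({"symbol": "IBM", "side": "buy", "qty": 100})
print(f"confirmed in {latency:.3f} ms")
```

On a real trading system the same measurement would be taken against the exchange gateway, which is why shaving even a millisecond from the round trip matters.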

This can be achieved by locating firms' algorithmic trading and data processing centers as close to the exchanges as physically possible. Exchanges have also benefited from this co-location of servers by charging firms for the service; stock exchanges have seen a 20-30% increase in revenue from such activities.

The CEO and founder of BATS (Better Alternative Trading System) believes that the push for latency reduction will continue and will be never-ending. He is quoted as saying: "If anybody knows how to get a signal transmitted faster than the speed of light, I'd like to talk with them." He is a big proponent of ultra-fast trading on electronic platforms and feels that Wall Street needs a change. Nasdaq has also been forthcoming in its acceptance of technology platforms and electronic trades.

This article highlights the fact that technology is a key aspect of trading and is the future of stock trading. Electronic trading is in stride with the global push to break down political, geographical, and institutional barriers to capital markets and to create more transparent and fair markets. With further technological breakthroughs there could even be a move to 24-hour trading.

However, there comes a point of diminishing marginal returns in latency reduction below a millisecond, where speed alone can no longer differentiate your company from the next. Customer service and other old-fashioned strengths such as market know-how will come into play again. Personally, I think electronic trading can stabilize financial markets, reduce transaction costs, and create a more civilized trading atmosphere than the auction environment seen in some stock markets, where the floor resembles a fish market. The US exchanges are in stark contrast to other exchanges around the world, such as the Hong Kong Stock Exchange, where there is no auction pit. I believe electronic trading creates more order in stock trading practices.

Wednesday, September 12, 2007

Still Plenty of Jobs in the Capital Markets

Link: http://wallstreetandtech.com/story/showArticle.jhtml?articleID=201800876&pgno=1

This article discusses the vanishing role of traders on Wall Street as new technology replaces human capital. The article argues that technology is not a threat to Wall Street; rather, it is changing the role of humans, allowing companies to grow and allowing graduates to use their analytical skills while leaving the clerical work to algorithms. Approximately 500 NYSE traders were let go when the exchange moved to partial electronic trading.

The differences can be seen in the new roles humans play in the stock market. For example, floor brokers have been transformed into "quants," reports Credit Suisse's Dan Mathisson, who gives the example of his own department, where people work with algorithms to understand stock patterns and trade better while leaving execution to a computer model.

The second threat pits brokers and advisors against the Web. Websites like Zecco.com let individuals make free trades and get anonymous blog advice, which cuts into the work of financial advisors and brokers. Still, some companies say the technology is actually helping clients and brokers at the same time: faster trading and a smoother process make buy-side firms like T. Rowe Price attractive competitors to the larger investment banks.

The other difference arising is between research analysts and software. Companies are always looking for statistical software that reduces the work of research analysts running numbers. In the end, whatever saves costs, whether it is outsourcing or new software algorithms, will be welcome; how it shapes the industry remains to be seen.

Information Technology vs Insider Trading---Keith Sniatecki

http://www.wallstreetandtech.com/story/showArticle.jhtml?articleID=201801138&pgno=1


When one thinks of information technology in the financial services industry, international offenses and global criminal activity probably do not come to the forefront. Information technology, however, plays a critical role in the 21st century in combating one of the industry’s most serious misdeeds—insider trading.

The SEC mandates that firms monitor their employees for signs of insider trading. There are many ways to do this, and as technology has developed, monitoring has become both easier and more thorough. This is evidenced by the fact that in the past year, over a dozen i-bankers, analysts, and executives have faced charges; fewer than that were charged in the entire decade of the '90s, and, let's be honest, insider trading did not just surface in the past year.
Buzzwords associated with tracking potential insider trading are "highly sophisticated" and "automated." While a lot of detective-style investigative work still goes into nailing a suspect, a large part of any case is built on these computer systems.
For instance, Sherlock Holmes never used complex event processing (CEP). CEP is a high-level analyzer that uses techniques such as event stream processing, event correlation and abstraction, and event hierarchies to detect complex patterns among many events and relationships between events. In practice, this means it could detect a high volume of trades in a particular stock just before the announcement of a takeover or merger. The Financial Services Authority, the UK's version of the SEC, just announced it will use a new technology platform that monitors suspicious activity and delivers alerts through real-time graphical dashboards. Also available and employed by some companies in the US, this is known as an enterprise compliance system.
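To show the flavor of the pattern a CEP engine might flag, here is a toy Python sketch (my own, not a real CEP product): it watches a stream of trade events and raises an alert when a stock's volume jumps far above its recent average, the kind of spike that, correlated with a later takeover announcement, would draw a regulator's attention. The threshold and window sizes are made-up illustration values.

```python
from collections import deque

def volume_spike_alerts(events, window=20, threshold=3.0):
    """Flag trades whose volume exceeds `threshold` times the trailing
    average for that symbol. `events` is a stream of (symbol, volume)."""
    history = {}   # symbol -> deque of recent volumes
    alerts = []
    for symbol, volume in events:
        recent = history.setdefault(symbol, deque(maxlen=window))
        if len(recent) >= 5:   # need a baseline before judging spikes
            avg = sum(recent) / len(recent)
            if volume > threshold * avg:
                alerts.append((symbol, volume, avg))
        recent.append(volume)
    return alerts

trades = [("ACME", v) for v in [100, 110, 90, 105, 95, 100, 2000]]
print(volume_spike_alerts(trades))   # the 2000-share trade stands out
```

A real enterprise compliance system would correlate such spikes with news feeds, employee records, and cross-market data, but the core idea of pattern detection over event streams is the same.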
Another weapon in the fight against insider trading is already used by many companies in a different capacity. It’s great when an IT support person can remotely take control of your computer to fix problems, but that same technology can also be used to monitor activity or retrieve specific data. Ironically, recorded keystrokes on one i-banker’s Blackberry show that the last thing he actually entered before he was arrested on insider trading charges was the phone number to a well-known defense attorney.
We have said that financial services is one of the first industries to adopt new technology, so it should be no surprise that those tracking misdeeds in the industry are also up on the technology. These new programs and processes make it very hard to escape the long, technical arm of the law. That being said, people will still attempt insider trading; it just means they will need to get more sophisticated in their methods and try even harder to hide irregularities. I think they will fail, at least eventually. There is such an increased focus on stamping out corporate crime in today's tightened regulatory environment that the tracking will only increase. In this story, multiple law enforcement agencies worked together, along with the financial regulatory bodies of each country and the banker's employer.

TOKYO EXCHANGE CHOOSES FUJITSU FOR NEW SYSTEM

http://www.lexisnexis.com.proxyau.wrlc.org/us/lnacademic/results/docview/docview.do?risb=21_T2038109366&format=GNBFI&sort=RELEVANCE&startDocNo=1&resultsUrlKey=29_T2038109373&cisb=22_T2038109372&treeMax=true&treeWidth=0&csi=227171&docNo=19

This article describes the introduction of a new trading system at the Tokyo Stock Exchange (TSE). The TSE's systems have been a controversial issue lately: the exchange lost considerable trust after a string of system problems beginning in November 2005, when it processed an erroneous order from Mizuho Securities to sell 610,000 shares of J-Com Co. at 1 yen each instead of one share at 610,000 yen. Mizuho is still fighting the TSE over liability for the resulting 40 billion yen loss. The TSE's system rebuilding project began in earnest this January, aiming for operation in the second half of 2009. To choose a development vendor, the exchange held its first international bid and selected Fujitsu from a field of leading global vendors.

The TSE's requirements for the next-generation trading system were world-class speed, reliability, and scalability. Fujitsu's winning proposal applied new technology to each of these three points, and the TSE apparently judged that the system could still meet the world's highest standard ten years from now. Fujitsu aims to shorten response time for an order from the current one to two seconds to within 10 milliseconds. Processing business quickly is essential to raising efficiency.

To achieve this level of performance, the system holds all application and business data in memory. To make that safe, Fujitsu proposed a new in-memory database architecture called the three-fold node: data is replicated across three nodes so that business data is not lost even if the active main node goes down. Failover from the main system to a sub-system completes within seven seconds. Reliability is secured through the three-fold node design plus a backup site.
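The three-fold-node idea can be sketched in a few lines of Python. This is my own toy illustration of the concept, not Fujitsu's actual design: every write is replicated to all three in-memory nodes, so when the active node dies, a replica can be promoted and the data is still there.

```python
class TripleNodeStore:
    """Toy sketch of a three-fold-node in-memory store: every write
    goes to all three nodes, so data survives loss of the active node."""

    def __init__(self):
        self.nodes = [{}, {}, {}]   # main node plus two replicas
        self.active = 0             # index of the node serving reads

    def write(self, key, value):
        for node in self.nodes:     # synchronous replication to all nodes
            node[key] = value

    def read(self, key):
        return self.nodes[self.active][key]

    def fail_over(self):
        """Promote the next replica when the active node fails."""
        self.nodes[self.active] = None        # simulate node loss
        self.active = (self.active + 1) % 3
        return self.active

store = TripleNodeStore()
store.write("order:1", {"side": "sell", "qty": 1, "price": 610000})
store.fail_over()                  # the main node dies...
print(store.read("order:1"))       # ...the order is still on a replica
```

The real system additionally has to replicate within milliseconds and re-synchronize a recovered node, which is where the seven-second failover figure comes from.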

To conclude, I think a stock exchange must be trustworthy and fast. As globalization continues, order volume will rise rapidly and more companies from around the world will join the exchange. To handle those enormous amounts of information, it is essential to introduce a high-performance system that is fast and trustworthy as soon as possible. After the new system is introduced in 2009, what will happen to the TSE, and how will it influence other stock exchanges?

Rawan Al-Roomi - Still Plenty of Job Opportunities in the Capital Markets


In spite of trading floor automation, free online trading, and automated research tools, there are still plenty of job opportunities in the capital markets.
By Penny Crosman
Wall Street & Technology
August 22, 2007

Personally, I always thought that with the increase in technology there would be a decrease in job opportunities, and that always concerned me. This article argues otherwise: there are "still plenty of job opportunities in the capital markets." It explains that when the New York Stock Exchange began partial electronic trading, it was not the case that NYSE traders were simply no longer needed because technology was taking over.

It also surprised me that the article lists big companies such as Citigroup, Bank of America, and other Wall Street firms that enlarged their trading floors yet still take on traders. The article explains that technology is doing two things: "changing the role of humans, and it's allowing companies to absorb growth without adding people." After reading this, I dropped my brother off at the airport and saw an example of the same idea: airlines expanded their technology with electronic check-in, yet they still employ the same number of workers. The technology saves the customer's time without cutting staff.

Going back to the article, it also explains that traders should become high-tech themselves, because as technology grows, a person's knowledge of it should grow too. For example, many brokers have moved into higher roles, which should expand their skills, and the same goes for traders. How do traders do that? "They double up with people who know the stuff."

It is upsetting to hear that traders on Wall Street fear incoming electronic trading because they believe they will be replaced by it. What I believe is that the technology will not replace the workers; rather, the company or firm will get faster results and the workers will need to build higher-tech skills. Sadly, online trading does worry brokers, who face competition from technology: they are earning lower commissions, which could lead to a decrease in the number of brokers.




Jess Roper-- Are markets really more efficient?

Article: "The Internet and the Future of Financial Markets" Fan, Ming, Jan Stallaert, Andrew Whinston. Communications of the ACM. November 2000.

The article makes a bold statement in the first paragraph, basically saying that the replacement of brokers and traders with technology has not increased the efficiency of the market. Yet throughout the article, it shows that while technology may not have increased market efficiency, it definitely makes things easier for the small trader.

It has increased market efficiency in a few ways. By cutting out the broker as middleman, fees that had previously been at least fifty dollars have dropped to no more than fifteen, making it affordable for almost anyone to day trade. Technology plays a huge part in this simply by taking one more person out of the chain.

It decreases human error, or at least changes where human error can occur. Technology removes two human interfaces: the broker and the trader. Instead of possible miscommunication between the buyer and the broker, or the broker and the trader, the only remaining human error is wrong input into a computer, such as entering too many or too few zeros.

While technology does a great job of matching buyers and sellers more quickly and efficiently than a human possibly could, this raises the question: why does the article claim the markets are still not more efficient? Basically, it comes down to this: now that we can do things faster, we want to do more with them, and more complicated things, than ever before. Ever heard of an ETF? Those appeared not long ago; there were fewer than a hundred before 2006. Investors in general are trading more often because they can constantly keep up with market prices; the very serious ones may even run Morningstar or a similar program on their PCs for up-to-the-second information. "Day trader," a term that appeared in the '90s, means a person who trades on the market every day. A day trader was not possible before the internet, when an investor could not look up stock prices at any time during the day.

Also, there is a much higher volume of people in the markets, making it hard for the technology to keep up with demand. Markets still struggling to catch up with the technology being demanded must also keep in mind that whatever they upgrade to will itself become inefficient and outdated in short order.

In the end, has more technology made the markets more efficient? It is difficult to believe that the answer to that question is no. A higher volume of trades than ever is moving through the markets at a faster rate than ever. To me, that means something has changed to let people trade that fast. Since human beings are only capable of so much, that something could not be the humans at all, but the tools humans use: technology.

Nasdaq takes stock of integration

Lamonica, Martin. "Nasdaq takes stock of integration." InfoWorld 23.14 (May 2001): 56. Retrieved 10 Sept 2007.


My article discusses the growing significance of IT within the electronic financial marketplace known as Nasdaq. Specifically, the article interviewed the CIO, CTO, and CEO of Nasdaq to explain the general process the company uses to invest in new technologies. These three men each offered input on how IT benefits this emerging company. The New York company has been actively expanding internationally and fully intends to go public in the near future, which are reasons in themselves why it is important to invest carefully in new technologies. Nasdaq allocates more than half of its expenditures to IT.
CIO Gregor Bailar explained that nearly every IT expense is attached to a product line and a business division. Essentially, this lets Nasdaq as a whole efficiently measure IT's return on investment. In turn, this means Nasdaq can clearly see the effect that investment in IT has on the company's overall revenue.
Steven Randich, CTO of Nasdaq, stressed the importance of a strong relationship between business and IT: "I've specifically initiated a number of efforts to bring technology people out to see customers - traders - and build relationships with traders in order rooms on Wall Street so that they can see traders using Nasdaq workstations and other systems."
While some business executives see IT and business as two separate entities, Wick Simmons, CEO of Nasdaq, disagrees with the norm. Simmons stated, "I think most everybody's now come to the point where IT literacy on behalf of anybody running a business is an absolute must."
The most interesting thing in this entire article was Nasdaq's process for approving new IT projects. The process begins with a proposal or innovative idea, which can come from any facet of the business. The idea is taken to a business manager, and once he or she considers it, it is weighed on a return-on-investment determination: business managers and IT managers estimate the ROI and then submit the idea into a product portfolio. For a project to succeed, its ROI must fit within the company's operating margins. Nasdaq then follows up with a quarterly business review, which examines key functions the company needs to improve, followed by a technology review in which IT managers examine the existing infrastructure. This process gives Nasdaq the opportunity to stay consistently up to date with the technologies it integrates into its daily business functions.
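The ROI screen at the heart of that process can be sketched in a few lines. This is my own toy version, not Nasdaq's actual model: the hurdle rate, field names, and proposal figures are all made up for illustration.

```python
def roi(gain, cost):
    """Simple return on investment: net gain over cost."""
    return (gain - cost) / cost

def screen_proposals(proposals, min_roi=0.15):
    """Keep only IT project proposals whose estimated ROI clears a
    hurdle rate, the kind of screen the article describes."""
    portfolio = []
    for p in proposals:
        if roi(p["expected_gain"], p["cost"]) >= min_roi:
            portfolio.append(p["name"])
    return portfolio

proposals = [
    {"name": "new quote feed", "cost": 1_000_000, "expected_gain": 1_300_000},
    {"name": "lobby kiosk",    "cost": 200_000,   "expected_gain": 210_000},
]
print(screen_proposals(proposals))   # only the quote feed clears 15%
```

The quote feed returns 30% and passes; the kiosk returns only 5% and is dropped, which is exactly the kind of filtering that keeps the product portfolio inside the operating margins.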
Considering that Nasdaq plans to go public, one can see why the company is so careful about introducing new technologies. The chance that a new investment would not pay off beyond its initial cost could be discouraging to potential future investors, but more importantly could hurt its already established service. This means Nasdaq's leadership will continue to embrace business and IT hand in hand; however, they will measure every proposal so that they can continue to offer exceptional service to future investors.

Tuesday, September 11, 2007

John-Synthetic CDOs: Managers bet on long/short CDOs to deliver

Synthetic CDOs: Managers bet on long/short CDOs to deliver
Louise Bowman. Euromoney. London:Jul 2007. p. 1

http://proquest.umi.com.proxyau.wrlc.org/pqdweb?RQT=302&COPT=REJTPTRhNjEmSU5UPTAmVkVSPTI=&clientId=31806&cfc=1?did=1313080421&Fmt=3&clientId=31806&RQT=309&VName=PQD

This article explains the movement toward a long/short CDO strategy that tries to buy and sell when spreads are wide. Before reading the article I was a bit unclear as to what was meant by shorting. Now I understand that when you short, you are betting on the asset not doing well. I can understand why a company would want to do this, but the risk, it seems, is unlimited.
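To see why the risk on a short is unlimited, here is a quick sketch of my own (a stylized stock short, simpler than a CDO short but the same asymmetry): profit is capped at the entry price times the share count, while the loss grows without bound as the price rises.

```python
def short_pnl(entry_price, current_price, shares):
    """Profit/loss on a short position: you gain as the price falls
    and lose without limit as it rises."""
    return (entry_price - current_price) * shares

# Short 100 shares at $50:
print(short_pnl(50, 40, 100))    # price drops to 40: +1000 profit
print(short_pnl(50, 75, 100))    # price rises to 75: -2500 loss
print(short_pnl(50, 500, 100))   # no ceiling on the loss: -45000
```

This asymmetry is also why, as the article notes later, rating agencies penalize short buckets in CDO structures.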

The article goes on to say that the strategy for the last four to five years has been a long corporate credit strategy in a declining credit market. The market is facing this decline due to subprime. From my understanding, subprime refers to banks giving loans to people who do not actually qualify for the best interest rates. Now the banks are asking for their money back and borrowers cannot pay, which is pushing the market down. Shorting the market seems like a good idea, but I wonder if it is just a short-term fix. The article also mentions that the long/short technique is very challenging, and I wonder whether it is worth the risk.

It was interesting to find out that rating agencies tend to penalize short buckets in CDO structures because shorting is so costly. I can understand why, because again the risk is unlimited. In addition, I did not know that Deutsche has already sold two hybrid long/short deals.

BNP Paribas, the article explains, has found ways to work around the systemic spread widening that usually results in a loss in terms of carry. If BNP Paribas continues to refine this strategy, it might be possible to use it to help correct current market conditions and slow the subprime fallout. In addition, the article says the new long/short strategy allows trading not just on credit but also on maturity, which would be useful when buying or selling.

Furthermore, the article argues that the different maturities will allow people to take advantage of the spread. However, there is not yet an efficient way to price these CDOs using standard base correlation, which could be a problem. For now, firms may not get the best price when using the new strategy until an efficient pricing standard is in place.

I was surprised to find Goldman Sachs using this new strategy by selecting long and short portfolios of equal size. What I did not know was that the long portfolio carries a spread about twice as wide as the short portfolio's. The article mentions that the aim is to benefit from possible spread widening rather than actual default.

After reviewing this article, I feel the strategy is a big gamble right now, but it could be successful once the glitches are fixed. In theory it is a great idea, but at the same time there is still huge risk in trying to short your way out of subprime.

Ajai Murali - Are you Being Served?



This article by Emily Fraser discusses various types of programs and their advantages and disadvantages in front office, middle office, and back office environments.

Firms are looking for further ways to reduce latency in application timings in a high volume market in which sub-millisecond times play a significant role.

SOA, short for service-oriented architecture, is one design philosophy mentioned throughout the article. Service-oriented architecture lets firms reuse software by allowing applications to borrow pieces of code and interact with any other application, saving time and money. It is apparently very efficient at running middle- and back-office functions, and works well across legal entities and operating units.

According to the author, front-office applications need a shift away from service-oriented architecture toward something new, since they demand extremely low latency, real-time updates, and streaming data: higher-performance computing built around market and transaction data.

SOA runs primarily on XML, and though XML provides a dynamic environment for application development, it is not particularly well suited to low-latency applications: it is loaded with markup (tags, brackets, slashes) that is unnecessary for the basic data figures of daily market transactions.
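A quick comparison of my own makes the XML overhead concrete: the same quote encoded as XML versus as a compact fixed-width binary record. The field layout is invented for illustration, but the size gap is representative of why front-office feeds avoid XML.

```python
import struct

# One quote as XML: the markup dwarfs the actual numbers.
xml_quote = (b'<quote><symbol>IBM</symbol><bid>104.25</bid>'
             b'<ask>104.27</ask><size>500</size></quote>')

# The same quote as a packed binary record:
# an 8-byte symbol field, two 8-byte doubles, one 4-byte int.
binary_quote = struct.pack("<8sddi", b"IBM", 104.25, 104.27, 500)

print(len(xml_quote), "bytes as XML")
print(len(binary_quote), "bytes as binary")
```

Every extra byte must be generated, transmitted, and parsed on each tick, which is exactly the overhead the article says low-latency front-office systems cannot afford.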

One solution the article mentions is an emerging technology known as event stream processing (ESP). HSBC is currently developing a platform in which the back and middle offices run on SOA and the front office runs on ESP. The HSBC executive quoted argues that ESP is really only useful for the low-latency requirements of the front office, while SOA contains powerful elements that support the middle and back offices more productively.

ESP is an emerging technology already on its way to becoming a standard application among trading firms, and is becoming more useful for things like data monitoring, compliance, TCA, and risk management. The next big mainstream step is supposed to be event-driven architecture, much like ESP, but it still has a long way to go before smooth implementation is possible.

There are advantages to SOA, however. SOA is more useful for applications in which instant response is not absolutely critical. It is also more flexible and able to extend the life of legacy systems: since SOA borrows code from other applications, legacy systems can borrow code for updated functionality. And since mergers and acquisitions are ever-present in the financial world, SOA makes it easier for firms to consolidate applications.

I believe SOA is a good choice for its functionality, flexibility, and diversity. Though event stream processing is a promising new technology, it must be regarded as exactly that and not widely implemented until it matures. New firms can experiment with it, and eventually it will reach older systems. SOA is too entrenched to replace at the moment, and replacing it could cost an immense amount of money. Software architectures build on themselves; thinking back ten years, SOA was not even common. So ESP will eventually become the norm, but SOA is too easy and efficient for now.

Monday, September 10, 2007

Moronta- Sony Teams Up With Isilon

Sony Pictures, Sony Online Entertainment and Sony Pictures Imageworks Among Sony Business Units Tapping Isilon to Power Breakthroughs in Digital Entertainment Production and Distribution: Clustered Storage Deployed to Accelerate Creation, Management and Delivery of Digital Content to Consumers Worldwide. (2007, September 10). PR Newswire. Retrieved September 10, 2007, from ABI/INFORM Complete database. (Document ID: 1333106141).

"Sony Pictures, Sony Online Entertainment and Sony Pictures Imageworks Among Sony Business Units Tapping Isilon to Power Breakthroughs in Digital Entertainment Production and Distribution; Clustered Storage Deployed to Accelerate Creation, Management and Delivery of Digital Content to Consumers Worldwide"

This article covers Sony's ambitions to become the world's leading all-around entertainment company. I believe Sony made an excellent business move by calling on Isilon to supply its immense storage needs. Isilon Systems is a world leader in clustered storage systems and software for digital content. This move had to be made with all the new technologies coming out in Sony's arsenal.

I like the sound of Isilon for Sony because it consolidates all their information and makes it much more accessible at higher speeds. According to the article, the move will drastically reduce Sony's costs and the many complications that have previously come with storing all that information. Sony will be able to deliver technology and products to all corners of the world.

This is good news for consumers and shareholders. With Isilon providing storage solutions, many of the latest technologies should be easier to produce, which in turn should lower the costs of these products. Sony has always been a juggernaut in entertainment, but many people cannot afford products such as the PlayStation 3, which drastically reduces expected revenue. What boggles my mind is that the PlayStation 3 costs about 800 dollars to produce but sells for 600. Sony loses a massive amount of money on the systems and hopes to make it back on game sales. To me this does not make sense: if the company does not sell as many systems as it predicts, who is going to buy the games that are supposed to provide the profit? Hopefully products will now be produced more efficiently, at reduced cost, and be more attainable for people worldwide.

As for stockholders, I feel they should look into investing in Sony. Shareholder value should now be maximized thanks to the simplicity and efficiency of the system Isilon has provided. Sony can continue to innovate now that its storage problems are resolved. Sony stock has been merely OK before, but I now feel it has the potential to be a great money maker for investors.

This company is on the rise, but its future lies in the hands of those in management positions. They must make smart and ethical business decisions to keep Sony's freight truck cruising forward, and its employees and stockholders happy.

Sunday, September 9, 2007

Phil - Finance And Economics: Ahead of the tape; Algorithmic trading

The Economist. London: Jun 23, 2007. Vol. 383, Iss. 8534; pg. 99


Link:

http://proquest.umi.com.proxyau.wrlc.org/pqdweb?RQT=302&COPT =REJTPTRhNjEmSU5UPTAmVkVSPTI=&clientId=31806&cfc=1?did=12930 27981&sid=13&Fmt=3&clientId=31806&RQT=309&VName=PQD


This article describes how computer programs can generate buy and sell orders and make trades faster than human traders. Even though it is a short article, it goes into detail on important points, such as how "Dow Jones and Reuters, the news providers, now offer electronically 'tagged' news products that algorithms pick up to make programmed trading decisions." Britain's Financial Services Authority is also trying to get algorithms to comb through trading data.

The London Stock Exchange already uses this technology, and "on its first day processed up to 1,500 orders a second, compared with the 600 using its previous system." These programmed algorithms will help investors and brokers find useful information when buying and selling stock. Because of the processing speed, large buy and sell orders can be processed in minutes or even seconds. In America, an estimated $480 million will be spent this year alone on developing technology for algorithmic trading. In the longer run, placing the servers that run these algorithms near the exchange could shave milliseconds off the time it takes to trade, so both sides get a better price.

A problem with these algorithms, though, is that while combing through past transactions and the news, they may misread a news article's title. An example in the article suggests that the word "surprise" could mean the numbers are better or worse than they really are, according to Andrew Silverman of Morgan Stanley. Words like "share price" or "volume" may be better choices for the algorithm, Mr. Silverman suggests.
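The "surprise" pitfall is easy to show with a toy keyword tagger of my own (a deliberately naive sketch, nothing like a production news-trading algorithm): directional words give a clean signal, but "surprise" alone says nothing about direction.

```python
def naive_news_signal(headline):
    """Toy keyword tagger illustrating the ambiguity problem:
    'surprise' could be an upside OR a downside surprise."""
    text = headline.lower()
    if "beats" in text or "raises" in text:
        return "buy"
    if "misses" in text or "cuts" in text:
        return "sell"
    if "surprise" in text:
        return "ambiguous"   # direction unknown; trading on this is risky
    return "no signal"

print(naive_news_signal("Acme beats estimates"))           # buy
print(naive_news_signal("Acme earnings surprise"))         # ambiguous
print(naive_news_signal("Acme misses revenue forecast"))   # sell
```

A real tagged-news feed sidesteps some of this by attaching structured fields (ticker, figure, direction) instead of relying on raw headline words.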

With this technology, trading can be done much more easily and quickly than it is now. This article explained a lot, but it left out how the technology was built and how it can best be protected against hackers who might tamper with the algorithms. Another thing the article did not mention is whether this technology will be made available to individual stockholders or only to businesses.

Fatou Coulibaly - Breaking Out Article.

Citation: Anita Hawser, Breaking Out, Global Finance. New York: Apr 2007. Vol. 21, Iss. 4; pg. 61, 3 pgs

Breaking Out

This article explains the progressive emergence of Indian companies, especially Indian IT firms, on the global stage. The IT companies, as well as software-related service corporations, are making great advances in developing their technology. The revenue growth resulting from this improvement enables them to expand their investments and foreign acquisitions throughout the world. These companies are not only willing to participate in global trade, but, most importantly, they have the capabilities and required funds to undertake such operations and investment activities.

What is interesting about this article is that it discusses the cases of specific Indian companies involved in this trend. It also explains and justifies the ambitions of Indian companies to buy global firms. I was exceptionally surprised to read in the article that the US-Canadian company Novelis (the world's biggest aluminum producer) was bought by an Indian company called Hindalco Industries.

An example of the international success of Indian IT service firms is Infosys becoming, in 1999, the first Indian company listed in the elite Nasdaq-100 club. This achievement is exceptional because the list usually features great companies such as Apple Computer or Cisco Systems. I found that knowing this about Indian firms is encouraging for investors, because it would make them want to invest more in them.

This recognition has been possible because of the development of information technology services in India. Even if these companies have the necessary monetary and financial capability to pursue acquisitions on a worldwide scale, they must also possess the information technology means to carry out such transactions. Thus, I believe the ambition exists, and the technology is also available for them to benefit from such transactions, which is very encouraging. More Indian companies will be influenced by this trend and will want to get their names out, expand their business portfolios, and strive to be more integrated into the global trading system.

Another Indian company, Indusview, is bullish on this trend: it “predicts that India's IT and software-related services sector will continue to perform strongly, reaching $21 billion in revenues in 2007.” If this prediction proves true, it will be a significant milestone for the country’s economic and financial standing in the world. The IT firms will keep meeting investors’ expectations and, in doing so, attract more investment opportunities in their favor. The last prediction in the article that really persuaded me of the future success of such companies came from the American investment bank Goldman Sachs, which “predicts that India has the potential to achieve rapid economic growth over the next 30 to 50 years if development proceeds successfully.”

Finally, I was surprised and glad to learn from the article that Indian companies are displaying an effective and efficient “entrepreneurial dynamism” in order to get ahead financially. One example of this entrepreneurship is an Indian named Jignesh Shah, “who reportedly mortgaged his home to raise the seed capital for his company, Financial Technologies (India) Ltd, which went on to build a trading system for the Bombay Stock Exchange and India's first Multi Commodity Exchange.” In summary, the Indian financial and IT companies are not only ambitious but also have entrepreneurs who are succeeding in what they do.

Friday, September 7, 2007

Jessica Davison - Competition on the Nasdaq and the Growth of Electronic Communication Networks


Citation: Fink, Jason, Kristin Fink, and James P. Weston. "Competition on the Nasdaq and the Growth of Electronic Communications Networks." The Journal of Financial Services Research (2004): 1-37.

Link:

Competition on the Nasdaq and the Growth of Electronic Communication Networks


This article discusses the growth of ECNs, which are electronic trading systems, and how this growth has affected the Nasdaq. “Competing directly with Nasdaq dealers, ECNs offer a low-cost and anonymous alternative to traditional trading”. What I liked about this article is that it examined the effect of ECNs on stocks over a six-year time span in order to better understand what was changing in the market.

I found a number of things fascinating about this article. First, I had no idea that ECNs allowed anonymous trading; I honestly never considered that using a traditional broker forced people to give up their anonymity. The article pointed out that people are more likely to place orders if they have anonymity, “even though they are usually smaller orders and have lower fill rates compared to traditional dealers”.

This does not make complete sense to me. I understand why more people would want to trade if they had the option to trade anonymously, but I do not understand why it is mostly smaller transactions. This also raised a question in my mind: with the use of ECNs does it become more difficult for the SEC to regulate transactions? While this article does not focus on this issue, I hope to find future articles or information that discusses this further.

This article was also interesting because it presented evidence that, prior to the use of ECNs, the Nasdaq markets were often inefficient and shut out a large segment of the population due to collusion. I found it very interesting that, between new legislation passed around 2000 and the growing use of ECNs, the market has become more competitive and more efficient, and more people now appear to have a fair chance to enter and make money.

Given our class discussion last week, I started to almost feel bad for some of the brokers who are losing business to the ECNs. However, this article points out that this IT development has helped more people become involved in trading, which benefits the Nasdaq, the trading community, and the economy as a whole through increased competition.

However, despite the article's conclusion that ECNs benefit the market as a whole, it also considers some negative effects. For one, ECNs fragment the market so that orders are placed through many different centers instead of one central place, leaving the “potential to reduce overall market quality”. Anonymity is also a potential negative, along with forcing some traditional dealers out of the market. Still, although ECNs “may have a negative effect on the ability of firms to profitably purchase order flow,” the article concludes that they have improved the “long-run structure, conduct, and performance of the Nasdaq market.”

This article is extremely interesting; it shows how the research was conducted and clearly spells out the results of this long-term study. Prior to reading it, I was a little on the fence about whether ECNs were an overall positive development. Now, however, I feel much more confident that this technology is an overall positive step for the market.

Tuesday, September 4, 2007

Top of Mind Posting from Prof Klein

Anita Hawser, Open Access, Global Finance, June 2007: 21,6 page 37

"On demand software services are making increasingly sophisticated treasury management systems available to a much wider range of companies..."

This article describes options available to corporate treasurers for managing the regulatory reporting required by Sarbanes-Oxley and Basel II. What surprised me was the extreme prevalence of spreadsheets across businesses large and small for a wide variety of reporting, including cash forecasting. According to the article, over 60% of international corporate treasurers use spreadsheets. Why? Because it appears to be cost-effective, that is, until something goes wrong. The upfront investment associated with a Treasury Management System (TMS) has been estimated at almost $200K for six users. A PC-based spreadsheet with some file backup and recovery costs virtually nothing, unless you consider the cost to maintain the macros and format of the spreadsheets.
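The cost trade-off above can be sketched as a rough break-even calculation. Only the ~$200K upfront TMS figure comes from the article; the annual support and spreadsheet-maintenance figures below are invented assumptions purely to illustrate how a treasurer might frame the comparison.

```python
# Assumed figures (only TMS_UPFRONT is from the article; the rest are
# illustrative guesses, not real vendor pricing).
TMS_UPFRONT = 200_000    # estimated upfront TMS cost for six users (article)
TMS_ANNUAL = 30_000      # assumed annual TMS support/licensing
SHEET_UPFRONT = 0        # spreadsheets cost "virtually nothing" upfront
SHEET_ANNUAL = 50_000    # assumed cost of maintaining macros and fixing errors

def total_cost(upfront: int, annual: int, years: int) -> int:
    """Cumulative cost of a solution over a number of years."""
    return upfront + annual * years

for years in (1, 3, 5, 10):
    tms = total_cost(TMS_UPFRONT, TMS_ANNUAL, years)
    sheet = total_cost(SHEET_UPFRONT, SHEET_ANNUAL, years)
    print(f"{years:2d} yrs: TMS ${tms:,} vs spreadsheets ${sheet:,}")
```

Under these invented numbers the two options cost the same only after ten years, which illustrates the article's point: the spreadsheet looks cost-effective upfront, and the hidden maintenance burden only erodes that advantage slowly, if at all.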

TMS software has been available to companies for decades, and yet its implementation is not widespread, especially in small to middle-market firms. The upfront cost is clearly one significant hurdle. What I also find interesting is that early application service provider (ASP) solutions were viewed as watered-down versions of the licensed TMS products. Today, many vendors offer ASP customers the identical platform as the licensed version. The good news: it should be easier, faster, and less expensive for most firms to deploy.

This article does not fully account for the implications of implementing a major process change. Imagine the employee who measures their value by the spreadsheets they control; using a TMS is likely to displace these keepers of sacred data. Companies must also battle the "not invented here" syndrome, which keeps them focused on their uniqueness rather than their areas of similarity. Many companies, especially large ones, are reluctant to see that they share the same business requirements and, ultimately, the same processes. Information security and privacy must also be considered. That said, small to mid-size corporate treasurers spend so much time on compliance-related issues that the ASP delivery model for TMS solutions should not be ignored. Open Access serves to open the eyes of corporate treasurers to alternative ways to support the compliance reporting of their treasury operations.