Friday, April 23, 2010

Delphi development offered for Microsoft Visual Studio 2010

Embarcadero will offer an upgrade to its Delphi Prism platform next month, enabling Object Pascal-based development in Microsoft's newly shipping Visual Studio 2010 IDE.

Delphi Prism 2011 is built on top of the Visual Studio shell and can be used either with Visual Studio 2010 or independently. The Visual Studio shell provides the Visual Studio IDE framework as well as its editors.

"Delphi Prism 2011 is actually our .Net solution for the Delphi and .Net world," said Mike Rozlog, Embarcadero product manager for Delphi, which was once a Borland technology. Embarcadero acquired the former Borland development tools unit CodeGear in 2008.

Object Pascal is "considered to be one of the easiest languages to learn" in the .Net space, Rozlog said.

"Object Pascal in the .Net environment can take advantage of everything .Net has to offer," Rozlog said. Developers can build .Net. ASP.Net and data-driven applications.

When coupled with Visual Studio 2010, Delphi developers can take advantage of everything in Microsoft's software development platform, including capabilities for Silverlight, .Net Framework 4, and Windows Presentation Foundation, according to Embarcadero. Parallel processing enhancements in .Net Framework 4 are supported.

Developers also can build for Windows 7 and the Microsoft Windows Azure cloud platform.

Also featured is a redesigned IDE with improved usability and enhanced code editing.

Besides .Net 4 support and Visual Studio 2010 integration, the 2011 version of Delphi Prism features enhancements for aspect-oriented programming as well as language enhancements. An obfuscator is featured to make it more difficult to reverse-engineer code, Rozlog said. Developers also can take C# code and have it automatically converted to Prism code.

Delphi Prism 2011 offers improved connectivity to InterBase and Blackfish SQL databases. Developers also can build clients in .Net for the Delphi DataSnap application server.

Also featured with Delphi Prism 2011 is Novell's MonoDevelop IDE, for Mono-based application development with Delphi Prism. Mono is a runtime that allows developers to use .Net-based skills to build applications for platforms including Linux and Mac OS.

Embarcadero did not have information available Tuesday about the price of Delphi Prism 2011. The company plans to offer a new version of Delphi, for Win32 development, probably in the second half of this year.

This article, "Delphi development offered for Microsoft Visual Studio 2010," was originally published at InfoWorld.com.


By Paul Krill

Created 2010-04-20 02:14PM

Why Data Modeling is Now Critical to SOA Success

When SOA came on the scene, it promised to revolutionize how data is accessed within applications, across organizations, and across the Web: basically, anywhere it was needed.

Promoting the ultimate reuse of data and harnessing rapid data growth were other promises of SOA. Rather than duplicating data from one system to another, SOA provided cleaner ways to access the data directly and reuse it. It was supposed to turn spaghetti-like webs of disparate systems with one-off, proprietary interfaces into an orchestrated access layer that could ask for data from anywhere and put data back seamlessly, while adapting more readily to changing business demands.

While SOA has accomplished this, it has also created some new challenges. How is this new data “source” documented? How is it governed? Who’s accountable for maintaining quality and traceability to the back-end databases? At some point, the data in the SOA layer or enterprise service bus has to end up back in the database. Integrating and sharing data are hard enough; if no standards are applied in the SOA infrastructure, you can end up right back where you started, with time and money wasted.

Data lives in more places than just databases. SOA has been invaluable in enabling its reuse and controlling the data redundancy that can plague organizations. The backbone of Web services and SOA is XML and, more specifically, XML schemas (XSD). XSD development still elicits images of the "wild, wild west," where you build whatever you need with very little thought about reuse and standards. For the most part, XSDs have been created and managed by developers, not data architects. Developers typically work on one project at a time, and they typically do not think about enterprise-wide standards or about ensuring that data stored in one place is defined the same way as like data stored everywhere else.

As a result, not only can you have different representations of the same data in the SOA layer, but the version of the same data in the SOA layer can diverge greatly from data in source systems.

The XSD language also has different standards for how data is typed, and they provide a lot more freedom than database DDL. Precision and scale are optional on most data types. Maximum length is handled differently for like data types such as strings, dates, and integers. Primary key, foreign key, and check constraints are also treated differently. This can lead to drastic differences between the structure of the XSD and the back-end databases. If the source and target rules are not carried over to the XSD definition, it can cause many errors or, even worse, it can result in data loss as the payloads are passed between systems.
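For example, a column declared in DDL as VARCHAR(50) NOT NULL only keeps its rules in the service layer if the schema states them explicitly; a minimal, hypothetical XSD fragment that carries those constraints over might look like this:

    <!-- Hypothetical fragment (inside a complexType sequence) mirroring a
         CustomerName VARCHAR(50) NOT NULL column. Without the explicit
         maxLength facet (and minOccurs="1" for NOT NULL), the schema would
         accept values the back-end database will reject or truncate. -->
    <xs:element name="CustomerName" minOccurs="1">
      <xs:simpleType>
        <xs:restriction base="xs:string">
          <xs:maxLength value="50"/>
        </xs:restriction>
      </xs:simpleType>
    </xs:element>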

One approach to the issue is to involve data architects in XSD development. The architects can leverage their data models to create the XML structures, much as they use data models to create databases. Models have been used to govern database development for years, so why not use the same modeling processes on XSDs? Obviously, it will make your life easier to employ a data modeling tool that provides some level of custom mapping between the logical/physical model and the XSD. Once that is in place, the architects can control the structure of the XML and reuse the same data elements and definitions for both database development and XML development.

XML, by nature, is hierarchical. Meanwhile, the popular database platforms are relational. This presents new challenges for trying to use data models as the common language, since they are also mostly relational. However, relational models are what most data architects are familiar with. Throw an XSD development tool at them and they will be fish out of water, but give them their data modeling tool and they will be happy as clams. They will be able to play by their rules: they will understand the layout and the notation, and they will be able to apply their knowledge very quickly.

To find a happy medium, most architects use modeling tools to develop canonical data models to represent the XSDs. Canonical in mathematical terms means "of simplest or standard form," and that is exactly what these models are. They rest somewhere between a logical and a conceptual model. Some parts are normalized and some parts are denormalized. Most of the time, the intention is never to generate a database or DDL from a canonical model. The intention is to reuse the data elements from the database-driven logical models in the XSD-driven canonical models. This provides two things. First, you can leverage your existing investment in your data modeling tool. Second, you will save a lot of time working in a familiar environment.

One of the most important keys to success is to let technology and software do the heavy lifting. Most sophisticated data modeling tools enable you to reuse data elements and break a large model into smaller submodels or subject areas. This is critical for aligning the canonical models with existing standards, as well as aligning parts of the canonical model with the messages that are passed between systems. Even better, data modeling tools will help you selectively generate custom XSD code directly from the logical or physical model.

By Jason Tiret, Director of Modeling and Architecture Solutions, Embarcadero, 04/08/2010

Thursday, April 22, 2010

REST and SOAP: When Should I Use Each (or Both)?

Web developers today have a myriad of technologies to choose from: everything from simplified database access, to easy wrapping of existing middleware services, to a plethora of interesting client-side software. All of these products and tools are there to give web developers the ability to create the best web-based applications in the shortest amount of time.

However, having a massive set of possible software solutions is one challenge; picking the specific approach for specific parts of a web application is another. Web developers today have to juggle many of these decisions while standards and approaches seem to change daily.

Take, for example, the two approaches for interfacing with web services: SOAP (Simple Object Access Protocol) and REST (Representational State Transfer). Both approaches work, and both have advantages and disadvantages for interfacing with web services, but it is up to the web developer to decide which approach may be best for each particular case.

By now, most developers have at least peripherally been exposed to the REST approach, which uses a standard URI (Uniform Resource Identifier) to make a call to a web service, like http/https://www.mycompany.com/program/method?Parameters=xx. The approach is very simple to understand and can be executed on virtually any client or server that has HTTP/HTTPS support. The command can be executed using the HTTP GET method. Developers who use this approach cite ease of development, use of the existing web infrastructure, and little learning overhead as key advantages of the style.
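For instance, calling such a service from a Delphi client is nothing more than issuing an HTTP GET; here is a minimal sketch using the Indy TIdHTTP component that ships with Delphi (the URL is the hypothetical one above):

    program RestClientDemo;

    {$APPTYPE CONSOLE}

    uses
      SysUtils, IdHTTP;

    var
      Http: TIdHTTP;
      Response: string;
    begin
      Http := TIdHTTP.Create(nil);
      try
        // Plain HTTP GET against the hypothetical REST URI; the response body
        // comes back as a string in whatever format the service defines.
        // (An HTTPS URL would additionally need an SSL IOHandler assigned.)
        Response := Http.Get('http://www.mycompany.com/program/method?Parameters=xx');
        Writeln(Response);
      finally
        Http.Free;
      end;
    end.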

However, SOAP, the granddaddy of all web services interfaces, is not going away anytime soon; in fact, the introduction of SOAP 1.2 has fixed many of the perceived shortcomings of the technology and pushed it to new levels of both adoption and ease of use. It should also be noted that, as of the 1.2 specification from the W3C, the acronym SOAP no longer stands for Simple Object Access Protocol; it is now just the name of the specification.

Now keep in mind that using SOAP 1.2 involves some additional overhead that is not found in the REST approach, but that overhead also has advantages. First, SOAP relies on XML (Extensible Markup Language) in three ways: the Envelope, which defines what is in the message and how to process it; a set of encoding rules for datatypes; and the layout of the procedure calls and responses gathered. This envelope is sent via a transport (HTTP/HTTPS), an RPC (Remote Procedure Call) is executed, and the envelope is returned with information in an XML-formatted document.
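For reference, a minimal SOAP 1.2 envelope has this shape; the GetCustomer body is purely illustrative, since the real payload is defined by the service's own schema:

    <?xml version="1.0"?>
    <soap:Envelope xmlns:soap="http://www.w3.org/2003/05/soap-envelope">
      <soap:Header>
        <!-- optional header blocks (security tokens, routing, and so on) -->
      </soap:Header>
      <soap:Body>
        <!-- hypothetical RPC-style request element -->
        <GetCustomer xmlns="http://www.mycompany.com/customers">
          <CustomerId>42</CustomerId>
        </GetCustomer>
      </soap:Body>
    </soap:Envelope>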

It is important to note that one of the advantages of SOAP is its use of a “generic” transport. While REST today uses HTTP/HTTPS, SOAP can use almost any transport to send the request, everything from the aforementioned protocols to SMTP (Simple Mail Transfer Protocol) and even JMS (Java Message Service). However, one perceived disadvantage is the use of XML, because of its verbosity and the time it takes to parse.

However, the good news for web developers is that both technologies are very viable in today’s market. Both REST and SOAP can solve a huge number of web problems and challenges, and in many cases each can be made to do the developer's bidding, which means they can work across the same problem domains.

But the untold story is that both technologies can be mixed and matched. REST is very easy to understand and is extremely approachable, but does lack standards and is considered an architectural approach. In comparison, SOAP is an industry standard with a well-defined protocol and a set of well-established rules to be implemented, and it has been used in systems both big and small.

So this means the areas where REST works really well are:

  • Limited bandwidth and resources; remember that the return structure can be in any format (it is developer-defined). Plus, any browser can be used, because the REST approach uses the standard GET, PUT, POST, and DELETE verbs. Again, remember that REST can also be used with the XMLHttpRequest object that most modern browsers support today, which adds the extra bonus of AJAX.
  • Totally stateless operations; if an operation needs to be continued, then REST is not the best approach and SOAP may fit better. However, if you need stateless CRUD (Create, Read, Update, and Delete) operations, then REST is it.
  • Caching situations; if the information can be cached because of the totally stateless operation of the REST approach, this is perfect.

Those three cases cover a lot of solutions, so why would I even consider SOAP? Again, SOAP is fairly mature and well-defined, and it comes with a complete specification. The REST approach is just that, an approach, and is wide open for development, so if you have any of the following requirements, SOAP is a great solution:

  • Asynchronous processing and invocation; if your application needs a guaranteed level of reliability and security, then SOAP 1.2 offers additional standards to ensure this type of operation, such as WS-ReliableMessaging (WSRM).
  • Formal contracts; if both sides (provider and consumer) have to agree on the exchange format then SOAP 1.2 gives the rigid specifications for this type of interaction.
  • Stateful operations; if the application needs contextual information and conversational state management, then SOAP 1.2 has the additional specifications in the WS-* structure to support those things (Security, Transactions, Coordination, etc.). Comparatively, the REST approach would require developers to build this plumbing themselves.

As shown above, each technology approach has its uses. They both have underlying issues around security, transport layers, and the like, but both can get the job done, and in many cases each brings something to the web. So the best rule is the rule of flexibility, because no matter what the problem is, at least in today’s web development world, web developers have great solutions using either of these technologies.

About the Author

Mike Rozlog is the senior director of products for Embarcadero Technologies, a database tools and developer software company. In this role, he is focused on ensuring that the developer-focused products being created by Embarcadero meet the expectations of developers around the world. Much of his time is dedicated to discussing and explaining the technical and business aspects of Embarcadero’s products and services to analysts and other audiences worldwide. Mike was formerly with CodeGear, a developer tools group that was acquired by Embarcadero in 2008. Previously, he spent more than eight years working for Borland in a number of positions, including a primary role as Chief Technical Architect. A reputed author, Mike has been published numerous times. His latest collaboration is Mastering JBuilder from John Wiley & Sons, Inc.

Wednesday, April 21, 2010

dbExpress Database Access Components in Delphi - Delphi 101

Mastering Database Application Development

Watch online video tutorials to get a quick start and useful tips.  Learn from the Expert and see how to rapidly build high-performance database applications with Delphi, C++Builder, Delphi Prism and RAD Studio. 

 

dbExpress Database Access Components in Delphi - Delphi 101

This tutorial video describes how to use the dbExpress database access components in Delphi and RAD Studio along with a demonstration of rapidly building a database application with the Firebird database.

 

About the Presenter

Mike Rozlog
Product Manager of Delphi Solutions at Embarcadero Technologies

Mike Rozlog is the Product Manager of Delphi Solutions for Embarcadero Technologies. In this role, he is focused on ensuring the family of Delphi developer products being created by Embarcadero meets the expectations of developers around the world. Much of his time is dedicated to discussing and explaining the technical and business aspects of Embarcadero’s products and services to analysts and other audiences worldwide. Mike was formerly with CodeGear, a developer tools group that was acquired by Embarcadero in 2008. Previously, he spent more than eight years working for Borland in a number of positions, including a primary role as Chief Technical Architect. A reputed author, Mike has been published numerous times. His latest collaboration is Mastering JBuilder from John Wiley & Sons, Inc.

Tuesday, April 20, 2010

Delphi 101: Database Access Methods in Delphi

Mastering Database Application Development

Watch online video tutorials to get a quick start and useful tips.  Learn from the Expert and see how to rapidly build high-performance database applications with Delphi, C++Builder, Delphi Prism and RAD Studio. 

Delphi 101: Database Access Methods in Delphi

This tutorial video describes the database access options in Delphi and C++Builder and helps you select the best approach for your database projects. 

 

 

About the Presenter

Mike Rozlog
Product Manager of Delphi Solutions at Embarcadero Technologies

Mike Rozlog is the Product Manager of Delphi Solutions for Embarcadero Technologies. In this role, he is focused on ensuring the family of Delphi developer products being created by Embarcadero meets the expectations of developers around the world. Much of his time is dedicated to discussing and explaining the technical and business aspects of Embarcadero’s products and services to analysts and other audiences worldwide. Mike was formerly with CodeGear, a developer tools group that was acquired by Embarcadero in 2008. Previously, he spent more than eight years working for Borland in a number of positions, including a primary role as Chief Technical Architect. A reputed author, Mike has been published numerous times. His latest collaboration is Mastering JBuilder from John Wiley & Sons, Inc.

Monday, April 19, 2010

Bill Inmon on Data Warehouse 2.0 and Modeling

Embarcadero Technologies Webinar Event

Embarcadero presents a Bill Inmon Webinar

Learn How Data Warehouse, Metadata, and Modeling Are Being Transformed

Register for Webinar!

Data warehousing is in a constant state of evolution. From a simple data warehouse to ETL, to data marts and operational data stores, data warehousing has continued to evolve over the years. Enter DW 2.0, the architecture for the next generation of data warehousing. DW 2.0 recognizes the life cycle of data within the data warehouse, the need for unstructured data, and the need for a formal and powerful metadata infrastructure and data modeling strategy.
Join the leading authority and “Father” of data warehousing, Bill Inmon, for an insightful webinar on modeling and metadata management strategies for Data Warehousing 2.0.
Date: Wednesday, April 21, 2010
Time: 11:00 AM PDT/ 2:00 PM EDT

In this complimentary one-hour webinar, you’ll learn:

  • How the Data Warehouse, Metadata, and Modeling environment will be transformed in the next few years—and what you need to do to leverage it for your business

  • The major components of Data Warehouse 2.0 architectures and the role of metadata and modeling

  • Key modeling and metadata management strategies for Data Warehousing 2.0

All webinar registrants will also receive a complimentary copy of Bill Inmon’s white paper, “Data Warehousing 2.0 – Modeling and Metadata Strategies for Next Generation Architectures”.
-------------------------------------------------------------------------------------------------------------
About the Presenter:
Bill Inmon
President and CTO
Forest Rim Technology, LLC


Bill Inmon, “the father of data warehousing,” has written 50 books that have been translated into 9 languages. Bill founded and took public the world’s first ETL software company. Bill has written over 1,000 articles and has been published in most major trade journals. Bill has conducted seminars and spoken at conferences on every continent except Antarctica. Bill holds three software patents. Bill’s latest company is Forest Rim Technology, a company dedicated to the access and integration of unstructured data into the structured world. Bill’s website, inmoncif.com, has attracted over 1,000,000 visitors a month. Bill’s weekly newsletter on b-eye-network.com is one of the most widely read in the industry and goes out to 75,000 subscribers each week.

Don’t miss this live event!
Date: Wednesday, April 21, 2010
Time: 11:00 AM PDT/ 2:00 PM EDT
Duration: 1 hour
Register today.


