MSDN MAGAZINE
THE MICROSOFT JOURNAL FOR DEVELOPERS
JANUARY 2011, VOL 26, NO 1

COLUMNS
EDITOR'S NOTE: Change the World (Keith Ward)
TOOLBOX: Visual Studio Tools and Extensions (Terrence Dorsey)
CUTTING EDGE: Interceptors in Unity 2.0 (Dino Esposito)
FORECAST: CLOUDY: Branch-Node Synchronization with SQL Azure (Joseph Fultz)
THE WORKING PROGRAMMER: Multiparadigmatic .NET, Part 5: Automatic Metaprogramming (Ted Neward)
UI FRONTIERS: A Color Scroll for XNA (Charles Petzold)
DON'T GET ME STARTED: Turn! Turn! Turn! (David Platt)

FEATURES
WORKFLOWS
Scalable, Long-Running Workflows with Windows Server AppFabric (Rafael Godinho)
Authoring Custom Control Flow Activities in WF 4 (Leon Welicki)
PLUS
Using MEF to Expose Interfaces in Your Silverlight MVVM Apps (Sandrino Di Mattia)
Data Processing: Parallelism and Performance (Johnson Hart)
Use Multiple Visual Studio Project Types for Cloud Success (Patrick Foley)
Build a Data-Driven Enterprise Web Site in 5 Minutes (James Henry)

MSDN Magazine staff
LUCINDA ROWLEY Director
DIEGO DAGUM Editorial Director, [email protected]
KERI GRASSL Site Manager
KEITH WARD Editor in Chief, [email protected]
TERRENCE DORSEY Technical Editor
DAVID RAMEL Features Editor
WENDY GONCHAR Managing Editor
KATRINA CARRASCO Associate Managing Editor
SCOTT SHULTZ Creative Director
JOSHUA GOULD Art Director
ALAN TAO Senior Graphic Designer
CONTRIBUTING EDITORS: K. Scott Allen, Dino Esposito, Julie Lerman, Juval Lowy, Dr. James McCaffrey, Ted Neward, Charles Petzold, David S. Platt
Henry Allain President, Redmond Media Group
Matt Morollo Vice President, Publishing
Doug Barney Vice President, Editorial Director
Michele Imgrund Director, Marketing
Tracy Cook Online Marketing Director

ADVERTISING SALES: 508-532-1418 / [email protected]
Matt Morollo VP, Publishing
Chris Kourtoglou Regional Sales Manager
William Smith National Accounts Director
Danna Vedder Microsoft Account Manager
Jenny Hernandez-Asandas Director, Print Production
Serena Barnes Production Coordinator, [email protected]

1105 Media, Inc.
Neal Vitale President & Chief Executive Officer
Richard Vitale Senior Vice President & Chief Financial Officer
Michael J. Valenti Executive Vice President
Abraham M. Langer Senior Vice President, Audience Development & Digital Media
Christopher M. Coates Vice President, Finance & Administration
Erik A. Lindgren Vice President, Information Technology & Application Development
Carmel McDonagh Vice President, Attendee Marketing
David F. Myers Vice President, Event Operations
Jeffrey S. Klein Chairman of the Board

MSDN Magazine (ISSN 1528-4859) is published monthly by 1105 Media, Inc., 9201 Oakdale Avenue, Ste. 101, Chatsworth, CA 91311. Periodicals postage paid at Chatsworth, CA 91311-9998, and at additional mailing offices. Annual subscription rates payable in U.S. funds are: U.S. $35.00, International $60.00. Annual digital subscription rates payable in U.S. funds are: U.S. $25.00, International $25.00. Single copies/back issues: U.S. $10, all others $12. Send orders with payment to: MSDN Magazine, P.O. Box 3167, Carol Stream, IL 60132; email [email protected] or call (847) 763-9560. POSTMASTER: Send address changes to MSDN Magazine, P.O. Box 2166, Skokie, IL 60076. Canada Publications Mail Agreement No: 40612608. Return undeliverable Canadian addresses to Circulation Dept. or IMS/NJ, Attn: Returns, 310 Paterson Plank Road, Carlstadt, NJ 07072. Printed in the U.S.A. Reproduction in whole or part prohibited except by written permission. Mail requests to "Permissions Editor," c/o MSDN Magazine, 16261 Laguna Canyon Road, Ste. 130, Irvine, CA 92618.

Legal Disclaimer: The information in this magazine has not undergone any formal testing by 1105 Media, Inc. and is distributed without any warranty expressed or implied. Implementation or use of any information contained herein is the reader's sole responsibility. While the information has been reviewed for accuracy, there is no guarantee that the same or similar results may be achieved in all environments. Technical inaccuracies may result from printing errors and/or new developments in the industry.

Corporate Address: 1105 Media, Inc., 9201 Oakdale Ave., Ste. 101, Chatsworth, CA 91311, www.1105media.com

Media Kits: Direct your media kit requests to Matt Morollo, VP Publishing, 508-532-1418 (phone), 508-875-6622 (fax), [email protected]

Reprints: For single article reprints (in minimum quantities of 250-500), e-prints, plaques and posters contact PARS International, Phone: 212-221-9595, E-mail: [email protected], www.magreprints.com/QuickQuote.asp

List Rental: This publication's subscriber list, as well as other lists from 1105 Media, Inc., is available for rental. For more information, please contact our list manager, Merit Direct. Phone: 914-368-1000; E-mail: [email protected]; Web: www.meritdirect.com/1105

All customer service inquiries should be sent to [email protected] or call 847-763-9560.
EDITOR'S NOTE
Change the World
Keith Ward

The Internet, of course, has no end. But it did have a beginning. Nov. 12, 1990, was when Tim Berners-Lee detailed his plan for a "Hypertext Project" (w3.org/Proposal.html). For geeks like me, reading it can raise goosebumps. In a way, it's like reading the U.S. Constitution—this document established a framework for a new way of doing things.

Back in 1989-1990, Berners-Lee was looking for a way to more easily share information around the massive physics lab at which he worked. The idea was actually pretty basic, but of course what came out of it is not. Here's a pretty nice summation of their plan. How can you not get excited while reading this description?

"HyperText is a way to link and access information of various kinds as a web of nodes in which the user can browse at will. Potentially, HyperText provides a single user-interface to many large classes of stored information such as reports, notes, data-bases, computer documentation and on-line systems help."

Being, among other things, a first-class software engineer, Berners-Lee had a specific problem in mind that he wanted to solve:

"There is a potential large benefit from the integration of a variety of systems in a way which allows a user to follow links pointing from one piece of information to another one. This forming of a web of information nodes rather than a hierarchical tree or an ordered list is the basic concept behind HyperText."

Systems integration never sounded so simple, did it? The key component to tie the information together is described here:

"A program which provides access to the hypertext world we call a browser. When starting a hypertext browser on your workstation, you will first be presented with a hypertext page which is personal to you: your personal notes, if you like. A hypertext page has pieces of text which refer to other texts. Such references are highlighted and can be selected with a mouse (on dumb terminals, they would appear in a numbered list and selection would be done by entering a number). When you select a reference, the browser presents you with the text which is referenced: you have made the browser follow a hypertext link. That text itself has links to other texts and so on."

And here, in a nutshell, is why it's so cool:

"The texts are linked together in a way that one can go from one concept to another to find the information one wants. The network of links is called a web. The web need not be hierarchical, and therefore it is not necessary to 'climb up a tree' all the way again before you can go down to a different but related subject."

And the rest is history.

The point of our little trip in Doc Brown's DeLorean back to the Web's beginning: You never know where an idea will lead. But if you don't break through boundaries, don't ask "What if ...?" enough, don't aspire to make a bigger impact, you surely won't.

It's now 2011. The year ahead looms. Will you be content with doing the same old thing you've been doing for weeks, months or years? Or will you invent something? You can't invent the Web, but you can invent a way to do it better. Can you make the world better with your software, even if it's only a tiny little corner of it? In other words—are you satisfied, or are you wanting to do more than you've done?

Stay thirsty, my friends.

© 2011 Microsoft Corporation. All rights reserved. Complying with all applicable copyright laws is the responsibility of the user. Without limiting the rights under copyright, you are not permitted to reproduce, store, or introduce into a retrieval system MSDN Magazine or any part of MSDN Magazine. If you have purchased or have otherwise properly acquired a copy of MSDN Magazine in paper format, you are permitted to physically transfer this paper copy in unmodified form. Otherwise, you are not permitted to transmit copies of MSDN Magazine (or any part of MSDN Magazine) in any form or by any means without the express written permission of Microsoft Corporation. A listing of Microsoft Corporation trademarks can be found at microsoft.com/library/toolbar/3.0/trademarks/en-us.mspx. Other trademarks or trade names mentioned herein are the property of their respective owners.

MSDN Magazine is published by 1105 Media, Inc. 1105 Media, Inc. is an independent company not affiliated with Microsoft Corporation. Microsoft Corporation is solely responsible for the editorial contents of this magazine. The recommendations and technical guidelines in MSDN Magazine are based on specific environments and configurations. These recommendations or guidelines may not apply to dissimilar configurations. Microsoft Corporation does not make any representation or warranty, express or implied, with respect to any code or other information herein and disclaims any liability whatsoever for any use of such code or other information. MSDN Magazine, MSDN, and Microsoft logos are used by 1105 Media, Inc. under license from owner.

Visit us at msdn.microsoft.com/magazine. Questions, comments or suggestions for MSDN Magazine? Send them to the editor: [email protected].

TOOLBOX
Visual Studio Tools and Extensions
Terrence Dorsey

Because you're reading this magazine, there's a good chance you sling code for a living. And if you sling code for a living, you probably spend a lot of time in your IDE ... which is—because you're reading this magazine—probably Visual Studio. Visual Studio 2010 is already an incredibly versatile coding tool. It does pretty much everything except write the code for you, and in many cases it's getting good at doing that, too.

Still, Visual Studio can't do it all out of the box. That's where extensions come to the rescue. Visual Studio 2010 provides robust support for extensibility via custom tools, templates and plug-ins. (Note, however, that the Express versions of Visual Studio don't support extensions.) If you can't find the feature you need in Visual Studio, chances are there's an extension that helps you customize the IDE or provides the tools you need to write code better and faster. We'll cover a few of the most popular free extensions for Visual Studio 2010.

Power Tools for Visual Studio
There are thousands of extensions out there, and it just so happens that one of the most robust extensions was created by the Visual Studio team. Visual Studio 2010 Productivity Power Tools (bit.ly/g4fUGG) is a package of 15 handy features that range from Solution Navigator (think Solution Explorer on steroids) to tab autocompletion and highly configurable enhancements to tabs. Scott Guthrie explains how each of the features in Productivity Power Tools works on his blog, so check that out for details (bit.ly/aopeNt).

[Screenshot: Solution Navigator in Productivity Power Tools]

PowerCommands 10.0
PowerCommands 10.0 (bit.ly/hUY9tT), like Productivity Power Tools, is a grab bag of useful extra tools that will speed up or simplify common tasks in the IDE. You get 25 features in the package; they include robust copy and paste enhancements (copying entire classes, for example). The package also includes the ability to format your code, sort using statements and remove unused using references when saving.

Team Foundation Server Power Tools September 2010
Don't feel left out if you're using Visual Studio Team Foundation Server (TFS). Microsoft has a set of Power Tools for you, too. This extension (bit.ly/hyUNqo) gives you 11 new features that include check-in policies and item templates, a process editor, TFS command-line tools and Windows PowerShell cmdlets, team member management, Windows shell integration and automated database backup.

Visual Studio Color Theme Editor
It may not sound as glamorous, but sometimes it's the little details that make coding that much easier. Take the colors used in the Visual Studio windows, tabs and menus, for instance. Do brighter colors cheer your mood? Are you particularly fond of magenta? Whatever you prefer, Visual Studio Color Theme Editor (bit.ly/fPKKEV) lets you customize all of the environment colors used in the IDE. You can also save themes and share them with your friends.

StudioStyles
An even more personal choice is the colorization used for the code itself in your editor. StudioStyles (studiostyl.es) is a Web site that lets you download, create and share the .vssettings files that specify code colorization. Added bonus: These themes can be used with Visual Studio 2010, 2008, 2005 and even the Express versions.

[Screenshot: StudioStyles]

WordLight
Do you ever want to quickly find all the places you've used a method or variable name? WordLight (code.google.com/p/wordlight) is a simple extension for Visual Studio 2008 that lets you select some text and instantly highlights all other occurrences of that string in the code file. It also works in the Output, Command and Immediate windows.

Spell Checker
If y0u tpye lke I do, the Spell Checker is a lifesaver. The Spell Checker extension (bit.ly/aMrXoM) looks for errors in the non-code portions of your files. It works in any plain-text files, for comments and strings in source code, and for non-tag elements of HTML and ASP files.

TortoiseSVN Add-in for Visual Studio
So you've written and tested your code. If you're working on a team or open source project, you probably need to commit your source to a repository. There's a tool for that.
If you're using Apache Subversion (subversion.apache.org) source control along with a TortoiseSVN client for Windows (tortoisesvn.tigris.org), there are a couple of Visual Studio extensions that incorporate the TortoiseSVN functionality into the IDE, the TortoiseSVN Add-in (tsvnaddin.codeplex.com) and VsTortoise (vstortoise.codeplex.com), saving you many steps in the commit process. When using TFS, you'll need to add a layer such as SvnBridge (svnbridge.codeplex.com) that translates APIs between Subversion clients like TortoiseSVN and TFS.

Another popular source-code management system is Git (git-scm.com), and if that's your preferred repository, then there's an extension for you, too. Git Extensions (code.google.com/p/gitextensions) includes shell extensions for Windows Explorer and a Visual Studio plug-in. Plus, you can run most features from the command line.

NuGet
Inspired by RubyGems and similar package-management systems from the Linux development world, NuGet (nuget.codeplex.com) gives Microsoft .NET Framework developers the ability to easily incorporate libraries from source-code repositories directly into their local development projects. NuGet integrates with the Visual Studio 2010 IDE, and you can also run NuGet from the command line or via Windows PowerShell cmdlets.

[Screenshot: NuGet]

Emacs and Vim Emulation
In the beginning there was vi, and it was difficult to learn. Since those early days, Emacs and Vim have battled for supremacy as the One True Editor among coders. If you've chosen sides in that debate, yet find yourself using Visual Studio, then rejoice! The keybindings and many other features you know and love from Emacs and Vim are now available in extensions for Visual Studio. You can follow the progress of VsVim (bit.ly/e3GsMf) developer Jared Parsons via his blog (blogs.msdn.com/b/jaredpar/). More information about Emacs emulation (bit.ly/eXhaIK), along with lots of other great tips, can be found on the Visual Studio Team blog (blogs.msdn.com/b/visualstudio/).

A Gallery of Extensions
This is just the tip of the iceberg as far as Visual Studio extensions are concerned. Thousands of templates, custom controls and extensions are available through the Visual Studio Gallery (visualstudiogallery.msdn.microsoft.com), and more are being added all the time. Many are free, and there are trial versions available for many of the commercial products.

Write Your Own Extensions
Don't see what you need in the Visual Studio Gallery? Write your own! Visual Studio 2010 includes deep hooks for extensibility—anything from a custom project template to third-party tools that integrate directly with the IDE. Through the Extending Visual Studio developer center (msdn.microsoft.com/vstudio/vextend), MSDN Library articles and other resources in the Visual Studio community (bit.ly/aT1bDe), you'll find a vast amount of information to start creating custom Visual Studio extensions. You've already got the tools ... start coding!

Terrence Dorsey is the technical editor of MSDN Magazine. You can read his blog at terrencedorsey.com or follow him on Twitter at @tpdorsey.

CUTTING EDGE
Interceptors in Unity 2.0
Dino Esposito

In last month's column, I briefly introduced the interception mechanism used in the Unity 2.0 dependency injection container. After illustrating the core principles of aspect-oriented programming (AOP), I presented a concrete example of interception that's probably very close to the needs of many developers today.

Have you ever wanted to extend the behavior of an existing piece of code without having to touch the source code in any way? Have you just wanted to be able to run some additional code around the existing code? AOP was created to provide an approach that isolates core code from other concerns that crosscut the core business logic. Unity 2.0 offers a Microsoft .NET Framework 4-based framework for achieving this in a surprisingly quick and easy way.

To fully comprehend the purpose of this follow-up article, I'll begin by summing up what I discussed last month. As you'll find out, in last month's code I made some assumptions and used some default components. This month I'll step back and discuss in more detail the choices and options that you typically encounter along the way.

AOP in Unity
Imagine you've a deployed application where, at some point, you perform some business-specific action. One day, your customer asks to extend that behavior to perform some more work. You grab the source code, modify it, and bill a few hours of consulting time for coding up and testing the new features. But wouldn't it be better if you could add new behaviors seamlessly and without touching the existing source code?

Imagine a slightly different scenario. First, what if you're not an independent consultant, but work full time for the company? The more requests for change you get, the more hours you spend outside your current project and, worse yet, you risk creating new (and not necessarily required) branches of your codebase. So, you'd heartily welcome any solutions that would let you add new behaviors seamlessly and with no exposure to the source code.

Finally, imagine that someone reports a bug or a serious performance hit. You need to investigate and fix the problem and you want it to be absolutely unobtrusive. In this case, too, it would be better to add new behaviors seamlessly and with no exposure to the source code.

AOP helps you in all of these scenarios. Last month I demonstrated how to add pre- and post-processing code around an existing method without touching it by leveraging the interception API in Unity 2.0. That quick demo made a few assumptions, however. First, it worked on a type registered with the Unity Inversion of Control (IoC) infrastructure and instantiated via the Unity factory layer. Second, the pointcut was only defined through an interface. In AOP jargon, a pointcut represents a collection of places in the body of the target class where the framework will inject extra behaviors on demand. An interface-based pointcut means that only the members of the interface will be extended at runtime via code injection. Third, I mostly focused on the configuration settings to enable interception and disregarded the fluent API that allows you to configure Unity in code. In the rest of this article, I'll explore the fluent API and alternative ways of defining Unity interceptors.

Interceptable Instances
To add a new behavior to an existing or freshly created instance of a class, you must take some control over the factory. In other words, AOP is not pure magic and you'll never be able to hook up a plain CLR class instantiated via the standard new operator:

  var calculator = new Calculator();

The way in which an AOP framework gains control over the instance may differ quite a bit. In Unity, you can resort to some explicit calls that return a proxy for the original object or keep the whole thing running behind an IoC framework. For this reason, most IoC frameworks offer AOP capabilities. Spring.NET and Unity are two examples. When AOP meets IoC, the result is a seamless, easy and effective coding experience.

Let's start with an example where no IoC capabilities are used. Here's some basic code that makes an existing instance of the Calculator class interceptable:

  var calculator = new Calculator();
  var calculatorProxy = Intercept.ThroughProxy<ICalculator>(calculator,
    new InterfaceInterceptor(),
    new[] { new LogBehavior() });
  Console.WriteLine(calculatorProxy.Sum(2, 2));

You end up working with an interceptable proxy that wraps your original object. In this case, I'm assuming that the Calculator class implements the ICalculator interface. To be interceptable, a class must either implement an interface or inherit from MarshalByRefObject. If the class derives from MarshalByRefObject, then the interceptor must be of type TransparentProxyInterceptor:

  var calculator = new Calculator();
  var calculatorProxy = Intercept.ThroughProxy(calculator,
    new TransparentProxyInterceptor(),
    new[] { new LogBehavior() });
  Console.WriteLine(calculatorProxy.Sum(2, 2));

The Intercept class also offers a NewInstance method you can call to create an interceptable object in a more direct way. Here's how to use it:

  var calculatorProxy = Intercept.NewInstance<Calculator>(
    new VirtualMethodInterceptor(),
    new[] { new LogBehavior() });

Note that when you use NewInstance, the interceptor component has to be slightly different—neither InterfaceInterceptor nor TransparentProxyInterceptor, but rather a VirtualMethodInterceptor object. So how many types of interceptors exist in Unity?

Instance and Type Interceptors
An interceptor is the Unity component responsible for capturing the original call to the target object and routing it through a pipeline of behaviors so that each behavior has its chance to run before or after the regular method call. Interception can be of two types: instance interception and type interception. Instance interceptors create a proxy to filter incoming calls directed at the intercepted instance. Type interceptors generate a new class derived from the type being intercepted, and work on an instance of that derived type. Needless to say, the delta between the original and derived type is all in the logic required to filter incoming calls.

In case of instance interception, the application code first creates the target object using a classic factory (or the new operator), and then is forced to interact with it through the proxy provided by Unity. In the case of type interception, the application creates the target object through the API of Unity, then works with that instance. (You can't create the object directly with the new operator and get type interception.) The target object, however, is not of the original type. The actual type is derived by Unity on the fly and incorporates interception logic (see Figure 1).

[Figure 1 Instance Interceptor and Type Interceptor (diagram; labels: Application Code, Target Object, Actual Target)]

InterfaceInterceptor and TransparentProxyInterceptor are two Unity interceptors that belong to the instance interceptor category. VirtualMethodInterceptor belongs to the type interceptor category. InterfaceInterceptor can intercept public instance methods on just one interface on the target object.
The interceptor can be applied to new and existing instances. TransparentProxyInterceptor can intercept public instance methods on more than one interface, and marshal-by-reference objects. This is the slowest approach for interception, but it can intercept the widest set of methods. The interceptor can be applied to new and existing instances. VirtualMethodInterceptor can intercept virtual methods, both public and protected. The interceptor can only be applied to new instances.

It should be noted that instance interception can be applied to any public instance methods, but not to constructors. This is fairly obvious for the scenario in which interception applies to an existing instance. It's a bit less obvious when interception is applied to a newly created instance. The implementation of instance interception is such that the constructor has already executed by the time the application code gets back an object to work with. As a result, any interceptable action necessarily follows the creation of the instance.

Type interception uses dynamic code generation to return an object that inherits from the original type. In doing so, any public and protected virtual methods are overridden to support interception. Consider the following code:

  var calculatorProxy = Intercept.NewInstance<Calculator>(
    new VirtualMethodInterceptor(),
    new[] { new LogBehavior() });

The Calculator class looks like this:

  public class Calculator
  {
    public virtual Int32 Sum(Int32 x, Int32 y)
    {
      return x + y;
    }
  }

Figure 2 shows the actual name of the type that results from a dynamic inspection of the calculatorProxy variable.

It's also worth noting that there are some other significant differences between instance and type interception—for example, intercepting calls by the object on itself. If a method is calling another method on the same object when using type interception, then that self-call can be intercepted because the interception logic is in the same object. However, with instance interception, the interception only happens if the call goes through the proxy. Self-calls, of course, don't go through the proxy and therefore no interception happens.

Using the IoC Container
In last month's example, I used the IoC container of the Unity library to take care of object creation. An IoC container is an extra layer around object creation that adds flexibility to the application. This is even truer if you consider IoC frameworks with additional AOP capabilities. Furthermore—and as I see things—the level of code flexibility grows beyond imagination if you also combine IoC containers with offline configuration. However, let's start with an example that uses Unity's container with code-based fluent configuration:

  // Configure the IoC container
  var container = UnityStarter.Initialize();

  // Start the application
  var calculator = container.Resolve<ICalculator>();
  var result = calculator.Sum(2, 2);

Any code needed to bootstrap the container can be isolated in a distinct class and invoked once at application startup. The bootstrap code will instruct the container how to resolve types around the application and how to deal with interception. A call to the Resolve method shields you from all the details of interception. Figure 3 shows a possible implementation of the bootstrap code. The nice thing is that this code can be moved to a separate assembly and loaded or changed dynamically. More importantly, you have a single place to configure Unity.
This won't happen as long as you stick to the Intercept class, which behaves like a smart factory and needs to be prepared every time you use it. So if you need AOP in your applications, by all means get it via an IoC container.

[Figure 2 Actual Type After Type Interception (screenshot)]

The same solution can be implemented in an even more flexible way by moving the configuration details off to the app.config file (or web.config if it's a Web application). In this case, the bootstrap code consists of the following two lines:

  var container = new UnityContainer();
  container.LoadConfiguration();

Figure 4 shows the script you need to have in the configuration file. Here I registered two behaviors for the ICalculator type. This means that any calls to public members of the interface will be pre- and post-processed by LogBehavior and BinaryBehavior. Note that, because LogBehavior and BinaryBehavior are concrete types, you actually don't need to register them at all. Unity's defaults will automatically work for them.

Behaviors
In Unity, behaviors are objects that actually implement the crosscutting concerns. A behavior is a class that implements the IInterceptionBehavior interface; it rewrites the execution cycle of the intercepted method and can modify method parameters or return values. Behaviors can even stop the method from being called at all or call it multiple times.

A behavior is made of three methods. Figure 5 shows a sample behavior that intercepts the method Sum and rewrites its return value as a binary string.

The method WillExecute is simply a way to optimize the proxy. If it returns false, the behavior won't execute. This is actually a bit subtler. Invoke will always be called, so your behavior will in fact execute even if you returned false. However, when the proxy or derived type is being created, if all the behaviors registered for the type have WillExecute set to false, then the proxy itself won't be created and you'll be working with the raw object again. It's really about optimizing proxy creation. The GetRequiredInterfaces method allows the behavior to add new interfaces to the target object; interfaces returned from this method will be added to the proxy.

Hence, the core of a behavior is the Invoke method. The parameter input gains you access to the method being called on the target object. The parameter getNext is the delegate for you to move to the next behavior in the pipeline and eventually execute the method on the target. The Invoke method determines the actual logic used to execute a call to a public method on the target object. Note that all intercepted methods on the target object will execute according to the logic expressed in Invoke. What if you want to use more specific matching rules? With plain interception as I described in this article, all you can do is run through a bunch of IF statements to figure out which method is actually being invoked, like so:

  if (input.MethodBase.Name == "Sum")
  {
    ...
  }

Next month I'll resume from here to discuss more effective ways to apply interception and define matching rules for intercepted methods.

Dino Esposito is the author of "Programming Microsoft ASP.NET MVC" (Microsoft Press, 2010) and coauthored "Microsoft .NET: Architecting Applications for the Enterprise" (Microsoft Press, 2008). Based in Italy, Esposito is a frequent speaker at industry events worldwide. You can join his blog at weblogs.asp.net/despos.
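The snippets above plug a LogBehavior into every interceptor, but the column never lists that class. As a rough sketch rather than code from the article, a logging behavior could follow the same IInterceptionBehavior pattern as the BinaryBehavior in Figure 5, tracing each call before and after it reaches the target:

  using System;
  using System.Collections.Generic;
  using Microsoft.Practices.Unity.InterceptionExtension;

  public class LogBehavior : IInterceptionBehavior
  {
    // No extra interfaces are added to the generated proxy
    public IEnumerable<Type> GetRequiredInterfaces()
    {
      return Type.EmptyTypes;
    }

    // Returning true keeps the behavior active in the interception pipeline
    public bool WillExecute
    {
      get { return true; }
    }

    public IMethodReturn Invoke(IMethodInvocation input,
      GetNextInterceptionBehaviorDelegate getNext)
    {
      // Pre-processing: trace the call before the target method runs
      Console.WriteLine("Calling {0}", input.MethodBase.Name);

      // Hand off to the next behavior in the pipeline (and ultimately the target)
      var methodReturn = getNext().Invoke(input, getNext);

      // Post-processing: trace the outcome
      if (methodReturn.Exception == null)
        Console.WriteLine("{0} returned {1}",
          input.MethodBase.Name, methodReturn.ReturnValue);
      else
        Console.WriteLine("{0} threw {1}",
          input.MethodBase.Name, methodReturn.Exception.Message);

      return methodReturn;
    }
  }

Because this sketch only reads the invocation and the return message, it composes cleanly with BinaryBehavior in the pipeline registered in Figure 4.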
Figure 3 Bootstrapping Unity

  public class UnityStarter
  {
    public static UnityContainer Initialize()
    {
      var container = new UnityContainer();

      // Enable interception in the current container
      container.AddNewExtension<Interception>();

      // Register ICalculator with the container and map it to
      // an actual type. In addition, specify interception details.
      container.RegisterType<ICalculator, Calculator>(
        new Interceptor<VirtualMethodInterceptor>(),
        new InterceptionBehavior<LogBehavior>());

      return container;
    }
  }

Figure 4 Adding Interception Details Via Configuration

  <unity xmlns="http://schemas.microsoft.com/practices/2010/unity">
    <assembly name="SimplestWithConfigIoC"/>
    <namespace name="SimplestWithConfigIoC.Calc"/>
    <namespace name="SimplestWithConfigIoC.Behaviors"/>
    <sectionExtension type="Microsoft.Practices.Unity.InterceptionExtension.Configuration.InterceptionConfigurationExtension, Microsoft.Practices.Unity.Interception.Configuration" />
    <container>
      <extension type="Interception" />
      <register type="ICalculator" mapTo="Calculator">
        <interceptor type="InterfaceInterceptor"/>
        <interceptionBehavior type="LogBehavior"/>
        <interceptionBehavior type="BinaryBehavior"/>
      </register>
      <register type="LogBehavior">
      </register>
      <register type="BinaryBehavior">
      </register>
    </container>
  </unity>

Figure 5 A Sample Behavior

  public class BinaryBehavior : IInterceptionBehavior
  {
    public IEnumerable<Type> GetRequiredInterfaces()
    {
      return Type.EmptyTypes;
    }

    public bool WillExecute
    {
      get { return true; }
    }

    public IMethodReturn Invoke(
      IMethodInvocation input,
      GetNextInterceptionBehaviorDelegate getNext)
    {
      // Perform the operation
      var methodReturn = getNext().Invoke(input, getNext);

      // Grab the output
      var result = methodReturn.ReturnValue;

      // Transform
      var binaryString = ((Int32)result).ToBinaryString();

      // For example, write it out
      Console.WriteLine("Rendering {0} as binary = {1}", result, binaryString);
      return methodReturn;
    }
  }

THANKS to the following technical expert for reviewing this article: Chris Tavares

FORECAST: CLOUDY
Branch-Node Synchronization with SQL Azure
Joseph Fultz

In my years prior to joining Microsoft and for the first few years thereafter, I was heavily involved in the retail industry. During this time, I found it almost humorous to see how many times the branch-node synchronization problem gets resolved as technologies advance. In my current role, I've had a fair amount of exposure to the oil and gas (O&G) industry, and I've found that it has a similar problem of synchronizing data between nodes. Much like retail chains, O&G companies have a wide range of devices and connectivity challenges. From a latent satellite connection on an offshore oil platform to an engineer in an oil field, the requirement for timely, accurate data doesn't change.

So, with both retail and O&G in mind, I'm taking a look at the challenge again, but this time with a little help from SQL Azure and the Sync Framework. I'll discuss how the cloud will help solve the problem of moving data between the datacenter (corporate), the branches (for example, store, rig, hub and so on) and individual devices (handheld, shared terminal, specific equipment and more). This month, I'll focus a little more on the general architecture and a little less on the implementation. I'll still give a few code examples for setting up synchronization between nodes and SQL Azure and filtering content as a way to reduce traffic and the time required for synchronization. In next month's column, I'll examine using a service-based synchronization approach to provide a scalable synchronization solution beyond splitting data across SQL Azure databases based on content or geographic distribution.

While the core problem hasn't changed, what have changed are the additional requirements that get added to the mix as technology becomes more advanced. Instead of solving the simple problem of moving data between nodes, we start adding things we'd like to have, such as increasing the data volume to get more detail, inclusion of various devices for collecting and displaying data, and closer-to-real-time feeds. Let's face it, the more we have, the more we want. In most cases, solving the data flow problems from a decade ago would be simple, but in today's world that solution would only represent the substrate of a more robust solution. For retail, the data flow can range from pretty easy—taking the form of pushing catalog-type data (menu, warehouse and so on) down and t-logs back up—to quite complex, by frequently updating inventory levels, real-time loss-prevention analysis, and manual product entries from the branch to corporate and between branches. For the most part, O&G companies have the same patterns, but they have some added complexities related to the operation, evaluation and adjustment of equipment in use.

I think about synchronization in the following ways to get a rough idea of the level of complexity (each one is subsequently more complex to implement and support):

1. Push read-only data from corporate to branch and onward.
2. Two one-way pushes on different data; one from corporate to branch (for example, catalog) and one from branch to corporate (for example, t-logs and inventory); this includes branch to branch—the focus is on the fact that it's basically two or more one-way syncs.
3. Bidirectional data synchronization between corporate and nodes (for example, manual product entry or employee information).
4. Peer synchronization between branches and between branches and corporate.

Type 4 is by far the most complex problem and typically leads to many conflicts. Therefore, I try to avoid this pattern, and the only two criteria that would force it are the need for real-time updates between nodes or the ability to sync branches if the corporate data store isn't accessible. Because near-real-time or real-time updates among too many nodes would generally create too much traffic and typically isn't a reasonable solution, the only criterion to which I really pay attention is the ability to sync without the master. In some cases, real-time information is needed between nodes, but this isn't generally the case for data synchronization. Rather, it's an event notification scenario, and a different tack is taken to address the need.

Defining the Solution Architecture
Generally, the most prevalent pattern that I see is to push data directly from the corporate master database (via a distributor of some type) down to servers at the branches and to mobile users. The distribution to workstations, point-of-sale (POS) terminals and other such devices is typically done from the server at the branch location (commonly called "back-of-house servers"), whereas the synchronization to mobile users (for example, laptops) is done directly from corporate to the machine via a client-initiated synchronization process (see Figure 1). Some organizations do this via the built-in replication features of the relational database management system (RDBMS), while others build processes to handle the distribution and collection of data. I'll maintain the pattern, but use an instance of SQL Azure in place of the distributor; in place of replication, I'll use the Sync Framework, which supports SQL Azure. Thus, I simply add a layer between the distributor and the branches (see Figure 2).

What do I get by inserting SQL Azure? Some of the benefits in a branch-node scenario are:

1. Scaling the data service without having to grow the datacenter footprint.
2. High availability of the data without additional cost and effort.
3. Potentially less security sensitivity, because it isn't the master data store.

Consider that in the first scenario, if the corporate connection or data store is down, all of the clients will have to hold on to their transactions. This could easily lead to lost data due to losing the device while waiting for the connection or due to simply running out of space on the device to store the transactions, as mentioned earlier. Additionally, if the branches have common data (for example, warehouse inventory data), they'll be working off of old data until corporate is back up. While there's no perfect solution, SQL Azure addresses this scenario by automatically making copies of the data and providing automatic failover. Also, by segmenting the flow of data through multiple SQL Azure databases, I can reduce the risk of being down and reduce load exposure further by using not only a separate instance but also different datacenters.

As a point of design, I must consider the impact of initiating synchronization from the branches or from the server. If the application is designed to sync from the master or distributor to the nodes, I enjoy the advantage of fewer points of management and support; on the downside, it puts some technical strains on the implementation, requiring:

1. Knowing the endpoints.
2. Knowing the appropriate scopes for each target.
3. Complexity in the synchronization process in order to make the synchronization of multiple nodes happen in parallel; the API semantics are really one pair of endpoints and one scope at a time.

By initiating synchronization from the target (node or branch, for example), the complexity is diminished for the synchronization code, as it:

• Can focus on the scope(s) for the application/device.
• More easily handles being occasionally connected.
• Only has to know and manage a few endpoints by which the distribution data is synchronized.

However, it will put a bit more complexity into the applications on the target device and could complicate support and maintenance by potentially having to debug the sync process or agent at each device. Ideally, if data must be synchronized for different applications, a separate process should be created that manages the synchronization based on a configuration file where scope, frequency and connection strings are defined for a sync agent to run. This sync agent would exist externally to the applications that are the data consumers on the devices, although the process would provide a means for a given application to initiate synchronization of its data (a sketch of what such a configuration file might look like follows).
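The column doesn't prescribe a format for that configuration file, so the following is purely a hypothetical sketch: the element and attribute names are invented for illustration and are not part of the Sync Framework. The idea is simply that the agent knows its endpoints, the scopes it owns, the direction each scope flows and how often it should run:

  <syncAgent>
    <endpoints>
      <!-- Placeholder connection strings; real values come from deployment -->
      <endpoint name="branchHub" connectionString="[SQL Azure connection string]" />
      <endpoint name="local" connectionString="[local SQL Server connection string]" />
    </endpoints>
    <scopes>
      <!-- Catalog-type data flows down from the hub on a slow cadence -->
      <scope name="CatalogScope" source="branchHub" target="local"
             direction="Download" frequencyMinutes="60" />
      <!-- Transaction logs flow up to the hub more frequently -->
      <scope name="TransactionScope" source="local" target="branchHub"
             direction="Upload" frequencyMinutes="5" />
    </scopes>
  </syncAgent>

An agent driven by a file like this keeps all schedule and endpoint knowledge in one place on the device, while individual applications simply ask the agent to run the scope they care about.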
This gives the benefit of initiating synchronization from the node, but it also reduces the support and maintenance aspect, because it's rolled up into a single process.

Using the Sync Framework, I tend to start with a mixed model of synchronization by initiating from the master data store to SQL Azure and subsequently initiating from the nodes to sync between the node and SQL Azure. Restated, one might say that data is pushed from master and pulled from the branches, with SQL Azure becoming the highly available central hub between the master and branches.

[Figure 1 Typical Architecture for Data Distribution (diagram; elements: Corp Data, Distributor, Back-of-House Servers, Workstations and POS, Mobile Users)]

[Figure 2 Base Architecture Using SQL Azure and the Sync Framework (diagram; elements: Corp Data, Distributor, SQL Azure, Sync Agent, Back-of-House/Branch Servers, Workstations and POS, Mobile Users)]

Based on the needs and constraints of the solution being considered, I think about the costs and benefits of moving the synchronization process control from one point in the chain to another (for example, device to cloud or corporate to cloud). Just a few of these considerations are questions such as:

• Is there a place to host the process for master?
• Are there security policies that conflict with hosting the sync process in SQL Azure?
• How many nodes at each level are synchronizing?
• Can the target device realistically support a sync process?
• What's the requirement with regard to timeliness of data sync?

What's more, each of these questions has multiple layers that need to be considered and against which possible solution designs must be vetted. While there isn't a one-for-all design, I like to start with the model described earlier and either synchronize multiple one-way syncs to accomplish something similar to bidirectional data sync, or use bidirectional synchronization between the device/corporate database and SQL Azure. After which, I look for scenarios that invalidate and force a modification to the design. Generally, the only synchronization style I attempt to avoid is peer-to-peer.

Setting up Synchronization
There are two methods to set up synchronization using Sync Framework 2.1: sync client in the cloud and sync client on the local
At the Microsoft Professional Developers Conference 2010, we announced an update to this service called SQL Azure Data Sync Community Technology Preview (CTP) 2. This update provides organizations the ability to easily extend on-premises SQL Server databases to the cloud, allowing for phased migrations of applications to the cloud. Solutions leveraging SQL Azure Data Sync will allow users to continue to access local data and have changes seamlessly synchronized to SQL Azure as they occur. Similarly, any changes made by applications to SQL Azure are synchronized back to the on-premises SQL Server. Keeping Data in Sync SQL Azure Data Sync provides a central cloud-based management system for all synchronization relationships. From any browser, administrators can connect to the public service and manage and monitor the various database endpoints. In addition, SQL Azure Data Sync provides a scheduling service that allows synchroniza- tion to take place as often as every five minutes, or less frequently if the preference is to run synchronization at off-peak times. In the recent SQL Azure Data Sync CTP 2 update, we also introduced a new component called the SQL Azure Data Sync Agent. This agent is a Windows Service that’s installed on-premises and links the local SQL Server databases to SQL Azure Data Sync through a secure outbound HTTPS connection, meaning there are no requirements from a firewall or security- configuration standpoint—which makes setup a snap. The Agent’s job is to monitor and log tasks as well as to initiate synchronization requests from the SQL Azure Data Sync. New Scenarios With SQL Azure Data Sync, synchronization between SQL Server and SQL Azure databases provides for a wealth of new scenarios that, in the past, were quite difficult to build. Imagine you wanted to share data with your branch offices or retail store databases. With SQL Azure Data Sync, this is easy because administrators create “Sync Groups” that define data to be shared among data- bases. These Sync Groups could contain a corporate SQL Server that synchronizes data to a centralized SQL Azure “Data Hub.” Then, from this Data Hub, all of the remote or regional SQL Server databases can synchronize data changes, enabling them to bring data closer to the users, while also greatly reducing bandwidth and requirements for Virtual Private Networks, or VPNs. In addition, the ability to synchronize across multiple SQL Azure datacenters makes it easy to scale out loads across geographies. Imagine you have quarterly reporting requirements that put a huge cyclical load on your SQL Server database. Why not synchronize some of that data to your SQL Azure databases around the world when needed? Then users could access the data closest to them while reducing scalability requirements on your local SQL Server. For more information and to register for participation in the CTP 2, please visit microsoft.com/en-us/sqlazure/datasync.aspx. —Liam Cavanagh, Senior Program Manager, SQL Azure Data Sync SQL Azure Data Sync SQL Azure becomes the highly available central hub between the master and branches. Untitled-1 1 10/4/10 11:54 AM msdn magazine 22 Forecast: Cloudy machine. I’ll focus on the latter for the moment. At its simplest, here are the steps to set up a synchronization relationship: 1. Identify the data to be synchronized and the direction of data fow. Tis will be used to defne the scopes (SqlSync- ScopeProvisioning) used to synchronize the data. 2. Download and install the Sync Framework 2.1 (bit.ly/gKQODZ). 
Note: If x64 is the target platform, a build target for x64 will need to be added or the SyncOrchestrator won't be able to resolve its dependencies.
3. Provision the databases and tables for synchronization; the entire database could be provisioned, or only specific tables, or it can be limited to given columns.
4. Add necessary filters. In the case that it's desirable to horizontally partition or otherwise filter the data, filters may be used.
5. Create and run the process to synchronize.

I'll be rather specific in this example, as it helps convey the message, and I'll start with databases in place at both ends. I create a connection to the local database and retrieve a definition (DbSyncTableDescription) for the table to be synchronized and add that table to the scope (DbSyncScopeDescription). In addition, I'll specify the particular columns, but this isn't necessary if the desire is to simply synchronize the entire table. Limiting the sync relationship to specific columns is a good way to optimize bandwidth usage and speed up processes (see Figure 3). For each structure that needs to be synchronized, a bit of code will need to be added to get the description; you must subsequently add it to the scope. The next step is to grab a scope-provisioning object and use it to provision each database if the scope doesn't already exist in that database, as shown in Figure 4. Because this is the first time that the scope has been provisioned in the database, there will be some new tables for storing scope information and also a table specifically for tracking the Authors scope that was provisioned in the databases. A good example of a console app to provision or sync a local and SQL Azure database can be found on the Sync Framework Team Blog at bit.ly/dCt6T0.

Synchronizing the Data
Once the databases are properly provisioned, it's pretty simple to get them synchronizing. It requires the creation of a SqlSyncProvider for each end of the activity with the scope specified. This involves the use of the SyncOrchestrator object, which is the magic behind the curtains that identifies the changes and moves them between the two ends. That code looks something like this:

SqlConnection LocalConnection = new SqlConnection(LocalConnectionString);
SqlConnection AzureConnection = new SqlConnection(AzureConnectionString);

SqlSyncProvider LocalProvider = new SqlSyncProvider(ScopeName, LocalConnection);
SqlSyncProvider AzureProvider = new SqlSyncProvider(ScopeName, AzureConnection);

SyncOrchestrator orch = new SyncOrchestrator();
orch.LocalProvider = LocalProvider;
orch.RemoteProvider = AzureProvider;
orch.Direction = SyncDirectionOrder.DownloadAndUpload;
orch.Synchronize();

Data and Geographic Dispersion
With simple replication of data handled, I can focus on optimizing the deployment architecture and data flow. Using the Sync Framework, I can specify filters; this in combination with SQL Azure can really be a huge benefit in branch-node architectures. Using the combination of the two, I can put the data closer to the ultimate consumer and optimize the bandwidth usage (and hence charges) by only synchronizing the data that matters for that region or data segmentation. Instead of using data servers in various geographical areas, data can simply be synchronized to an instance of SQL Azure in that geographical area, and those clients in that area can synchronize to it.

Figure 4 Provisioning Scope

// Create a provisioning object for "authors" and
// apply it to the on-premises database
SqlSyncScopeProvisioning onPremScopeConfig =
  new SqlSyncScopeProvisioning(onPremiseConn, authorsScopeDesc);
if (!(onPremScopeConfig.ScopeExists(authorsScopeDesc.ScopeName)))
{
  onPremScopeConfig.Apply();
}

// Provision the SQL Azure database from the on-premises SQL Server database
SqlSyncScopeProvisioning azureScopeConfig =
  new SqlSyncScopeProvisioning(azureConn, authorsScopeDesc);
if (!(azureScopeConfig.ScopeExists(authorsScopeDesc.ScopeName)))
{
  azureScopeConfig.Apply();
}

Figure 5 Synchronizing Data with Filters (corporate data flows through regional SQL Azure instances in the U.S., Europe and Asia, each with a target-based filter, down to back-of-house/branch servers, workstations and POS, and mobile users)

By segmenting the flow of data through multiple SQL Azure databases, I can reduce the risk of being down.

By spreading the data around geographically and implementing scopes that make sense for synchronizing particular data with a particular frequency, one can achieve fine-grained control over what, how, when and how much data flows across the wire, improving the user experience as it relates to data availability and freshness. Additionally, for end users who might travel between locations where it would be nice to be location-aware, the sync agent could locate itself and reach out to sync data specifically for the current location. A couple of examples of this are current stats or alerts for workers walking into a manufacturing/plant environment and current-day sales for regional managers of retail chains (see Figure 5).

Enabling filtering is no harder than provisioning a synchronization scope. Thus, there could be multiple scopes in existence that have different filters, or none at all. The needed change is simply to add two lines of code for each filter that's being added: one line to add a filter column to the table and a second to add the filter clause, which is basically a "where" condition. For my sample, I'm adding a filter based on state and synchronizing only changes for the state of Utah, or UT, like so:

onPremScopeConfig.Tables["authors"].AddFilterColumn("state");
onPremScopeConfig.Tables["authors"].FilterClause = "[authors].[state] = 'UT'";

If I want it to synchronize in both directions based on the filter, it will need to be added to both scopes as they're provisioned on each end.

Go Forth and Spread the Data
Adding SQL Azure to the mix, whether a single instance or multiple ones, can really enhance the data availability and overall performance when synchronizing to nodes by adding that ever-important layer of indirection. Because it's SQL Azure, one gets the performance, scalability and reliability without all of the headaches of designing, provisioning and managing the infrastructure.
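As a closing illustration of that two-line filter change, here's a minimal sketch that provisions the same filtered scope on both ends and then runs the bidirectional sync. It reuses the hypothetical names from the earlier listings (authorsScopeDesc, onPremiseConn, azureConn, ScopeName) and is meant as a shape of the approach, not production code:

// Apply the same Utah filter to both provisioning objects so the
// filter holds in both directions of the sync relationship.
SqlSyncScopeProvisioning onPremScopeConfig =
  new SqlSyncScopeProvisioning(onPremiseConn, authorsScopeDesc);
SqlSyncScopeProvisioning azureScopeConfig =
  new SqlSyncScopeProvisioning(azureConn, authorsScopeDesc);

foreach (SqlSyncScopeProvisioning scopeConfig in
  new[] { onPremScopeConfig, azureScopeConfig })
{
  scopeConfig.Tables["authors"].AddFilterColumn("state");
  scopeConfig.Tables["authors"].FilterClause = "[authors].[state] = 'UT'";
  if (!scopeConfig.ScopeExists(authorsScopeDesc.ScopeName))
    scopeConfig.Apply();
}

// Synchronize in both directions, as before.
SyncOrchestrator orch = new SyncOrchestrator();
orch.LocalProvider = new SqlSyncProvider(ScopeName, onPremiseConn);
orch.RemoteProvider = new SqlSyncProvider(ScopeName, azureConn);
orch.Direction = SyncDirectionOrder.DownloadAndUpload;
orch.Synchronize();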
Look for next month’s column, where I’ll expand on the implementa- tion and show how Windows Azure can be added into the mix for synchronization using the latest Sync Framework 4.0 CTP released in October (bit.ly/dpyMP8). „ JOSEPH FULTZ is an architect at the Microsof Technology Center in Dallas, where he works with both enterprise customers and ISVs designing and prototyping sofware solutions to meet business and market demands. He’s spoken at events such as Tech·Ed and similar internal training events. THANKS to the following technical expert for reviewing this article: David Browne Once the databases are properly provisioned, it’s pretty simple to get them synchronizing. msdn magazine 24 WORKFLOW SERVI CES Scalable, Long-Running Workfl ows with Windows Server AppFabric Business processes can cover a wide variety of applica- tion scenarios. Tey can include human workfows, business logic exposed through services, coordination of presentation layers and even application integration. Although those scenarios are diferent, successful business pro- cesses have a few things in common. Tey need to be simple to build, use and modify. They need to be scalable to meet the changing needs of the business. And, ofen, they need some form of logging for status, compliance and debugging. Workfows are a good example of business processes that have been codifed into applications. Tey embody all of those elements Rafael Godinho I mentioned: human business needs, business logic, coordination between people and applications, and the ability to easily enter data and retrieve status. Tat’s a lot for an application to do, and a lot to code, too. Fortunately, the Microsof .NET Framework and Windows Server AppFabric provide the tools you need to to create, deploy and confg- ure trackable, long-running workfow services. You’re probably already familiar with the .NET Framework. Windows Server AppFabric is a set of extensions for Windows Server that includes caching and hosting services for services based on Windows Communication Foundation (WCF) and Windows Workfow Foundation (WF). In this article I’ll walk you through the process of building a simple scalable workfow service using WCF, WF and Windows Server AppFabric. Creating a Workflow Service To create a workfow service you need to combine two technologies: WCF and WF. Tis integration is seamless to the developer and is done using specifc messaging activities to receive WCF messages in the workflow. The workflow is hosted in a workflow-specific WCF ServiceHost (the WorkfowServiceHost), which exposes WCF endpoints for these messages.. Among the messaging activities group, two of them can be used to receive information, allowing the workfow to accept messages from external clients as Web service calls: the Receive activity and the ReceiveAndSendReply template. This article discusses: • Creating a workflow service • Simplifying workflow persistence • Workflow tracking • Scaling the workflow service Technologies discussed: Windows Server AppFabric, Windows Communication Foundation, Windows Workflow Foundation Code download available at: code.msdn.microsoft.com/mag201101AppFabric 25 January 2011 msdnmagazine.com A Receive activity is used to receive infor- mation to be processed by the workfow. It can receive almost any kind of data, like built-in data types, application-defned classes or even XML- serializable types. Figure 1 shows an example of a Receive activity on the workfow designer. 
This type of activity has many properties, but four of them are extremely important to remember: • CanCreateInstance is used to determine if the workfow runtime must create a new workfow instance to process the incoming message, or if it will reuse an existing one using correlation techniques. I’ll discuss correlation in more detail later. You’ll probably want to set it to true on the frst Receive activity of the workfow. • OperationName specifies the service operation name implemented by this Receive activity. • Content indicates the data to be received by the service. This is much like WCF service operation contract parameters. • ServiceContractName is used to create service contracts grouping service operations inside the generated Web Services Description Language (WSDL). If used alone, the Receive activity implements a one-way message-exchange pattern, which is used to receive information from clients, but does not send them a reply. Tis kind of activity can also be used to implement a request-response pattern by associating it with a SendReply activity. To help implement the request-response pattern, WF adds an option to the Visual Studio toolbox called ReceiveAndSendReply. When dropped on the workfow designer, it automatically creates a pair of pre-confgured Receive and SendReplyToReceive activities within a Sequence activity (see Figure 2). Te idea behind the ReceiveAndSendReply template is to do some processing between the Receive and SendReplyToReceive actions. However, it’s important to notice that persistence is not allowed between the Receive and SendReplyToReceive pair. A no-persist zone is created and lasts until both activities have completed, meaning if the workfow instance becomes idle, it won’t persist even if the host is confgured to persist workfows when they become idle. If an activity attempts to explicitly persist the workfow instance in the no-persist zone, a fatal exception is thrown, the workfow aborts and an exception is returned to the caller. Correlating Calls Sometimes a business process can receive more than one external call. When that happens, a new workflow instance is created at the frst call, its activities are executed and the workfow stays idle, waiting for subsequent calls. When a later call is made, the work- fow instance leaves the idle state and continues to be executed. In this way, the workflow runtime must have a way to use information received on later calls and distinguish between the previously created workfow instances to con- tinue processing. Otherwise, it could call any instance, leaving the whole process consistency at risk. Tis is called correlation—you correlate subsequent calls to the pending workfow with which the call is associated. A correlation is represented as an XPath query to identify particular data in a specifc message. It can be initialized using an Initialize- Correlation activity or by adding a value to the CorrelationInitializers, a property of some activities, such as: Receive, SendReply, Send and ReceiveReply. This initialization process can be done in code or using the workflow designer from Visual Studio 2010. Because Visual Studio has a wizard to help create the XPath query, it’s the easier—and probably the preferable— way for most developers. A possible scenario to use correlation is an expense report workfow. First, an employee submits the expense report data. Later, his manager can review the report and approve or deny the expenses (see Figure 3). 
In this scenario the correlation is created when the workfow returns the response to the employee client application. To create a correlation you need some context-identifying information, like the expense report ID (which is probably a unique ID already). Ten the workfow instance becomes idle, waiting for the manager to approve or deny the expense report. When the approval call is made by the manager client application, the workfow runtime correlates the received expense report ID with the previously created workfow instance to continue the process. To create a correlation in Visual Studio 2010, frst select in the workfow designer the activity where the correlation is going to Figure 1 A Receive Activity on the Workflow Designer Figure 2 ReceiveAndSendReply on the Workflow Designer Figure 3 Expense Report Sample Scenario Submit Expense Report Employee Approve/Deny Expense Report Approve Expense Report Deny Expense Report Manager msdn magazine 26 Workflow Services be initialized. In my example, this is the activity that returns the expense report ID to the client. In the SendReply activity, I set the CorrelationInitializers property in the Properties window by click- ing the ellipsis button. Tis displays the Add Correlation Initializers dialog box (see Figure 4) where you can confgure the correlation. Tree items must be set: the correlation handle, the correlation type and the XPath Queries. Te correlation handle is a variable the workflow runtime uses to store the correlation data and is automatically created by Visual Studio. Te next step is to set the correlation type. Te .NET Framework has some types of correlation, but because I need to query part of the information exchanged with the client—in other words, a content-based correlation—my best option is to use the Query correlation initializer. After doing that, the XPath queries can be set to the expense report ID. When I click the arrow, Visual Studio checks the message content and shows me a list to select the appropriate information. To continue the workfow afer the expense approval is made, the correlation must be used by the corresponding Receive activity. Tis is done by setting the CorrelatesOn property. Just click the ellipsis button near the property in the Properties window to open the Correlates On Defnition dialog box (see Figure 5). From this dialog, the Correlates- With property needs to be set to the same handle used to initialize the correlation for the SendReplyToReceive activity, and the XPath Queries property must be set to the same key and expense report ID received on the expense report approval message. WF comes with a set of general-purpose activities called Base Activity Library (BAL), some of which I’ve used to send and receive information here. Tough they are useful, sometimes activities more related to business rules are needed. Based on the scenario I’ve discussed so far, there are three activities needed for submitting and approving expense reports: Create, Approve and Deny expense report. 
Because all of those activities are pretty similar, I’m only going to show the code of CreateExpenseReportActivity: public sealed class CreateExpenseReportActivity : CodeActivity<int> { public InArgument<decimal> Amount { get; set; } public InArgument<string> Description { get; set; } protected override int Execute(CodeActivityContext context) { Data.ExpenseReportManager expenseReportManager = new Data.ExpenseReportManager(); return expenseReportManager.CreateExpenseReport( Amount.Get(context), Description.Get(context)); } } Te activity receives the expense amount and description, both declared as InArgument. Most of the heavy lifing is done in the Execute method. It accesses a class that uses the Entity Framework to handle database access and to save the expense report information, and on the other end the Entity Framework returns the expense report ID. Because I only need to execute CLR code and don’t need to interact with the WF runtime, the easiest option to create an activity is to inherit from CodeActivity. Te complete work- fow can be seen in Figure 6. Hosting the Workflow Service Afer the workfow service is created, you need to decide where it will run. Te traditional choice has been to run it on your own hosting environment, IIS or Windows Process Activation Services (WAS). Another option, however, is to take advantage of Windows Server AppFabric, an enhancement to the Application Server role in Windows Server 2008 R2 for hosting, managing, securing and scaling services created with WCF or WF. You can also employ Windows Server AppFabric on PCs running Windows Vista or Windows 7 for development and testing. Tough IIS and WAS already support service hosting, Windows Server AppFabric ofers a more useful and manageable environ- ment that integrates WCF and WF features such as persistence and tracking with IIS Manager. Simplified Workflow Persistence Computers still have a limited set of resources to process all of your business processes, and there’s no reason to waste computing resources on idle workflows. For long-running processes, you may have no control over the total amount of time from the beginning of the process to its end. It can take minutes, hours, days or even longer, and if it depends on external entities, such as other systems or end users, most of the time it can be idle simply waiting for a response. WF provides a persistence framework capable of storing a durable capture of a workfow instance’s state—independent of pro- cess or computer information—into instance stores. WF 4 already has a SQL Server instance store to be used out of the box. However, because WF is very extensible, I could create my own instance store to persist the workfow instance state if I wanted to. Once the work- Figure 5 CorrelatesOn Definition Figure 4 Setting the XPath Query Correlation 27 January 2011 msdnmagazine.com fow instance is idle and has been persisted, it can be unloaded to preserve memory and CPU resources, or eventually it could be moved from one node to another in a server farm. Windows Server AppFabric has an easy way to set up and main- tain integration with WF persistence features. Te whole process is transparent to the workfow runtime, which delegates the persistence tasks to AppFabric, extending the default WF persistence framework. Te frst step to confgure persistence is to set up the SQL Server database using the Windows Server AppFabric Configuration Wizard or Windows PowerShell cmdlets. 
Te wizard can create the persistence database if it doesn’t exist, or just create the AppFabric schema. With the database already created, all the other steps are accomplished with IIS Manager. In IIS Manager, right-click the node you want to confgure (server, Web site or application) and choose Manage WCF and WF Services | Confgure to open the Confgure WCF and WF for Application dialog, then click Workfow Persistence (see Figure 7). You can see that you have the option to enable or disable workfow persistence. You also have the option to set how long the workfow runtime will take to unload the workflow instance from memory and persist it on the database when the workfow becomes idle. Te default value is 60 seconds. If you set the value to zero it will be persisted immediately. Tis is especially important for scaling out via a load balancer. Workflow Tracking Sometimes something can go wrong with processes that interact with external users and applications. Due to the detached nature of long-running processes, it can be even worse on those scenarios. When a problem occurs, as a developer you usually need to analyze a bunch of logs to discover what happened, how to reproduce it and, most important, how to correct the problem and keep the system up. If you Figure 6 Complete Expense Report Workflow Figure 7 Configuring Workflow Persistence Figure 8 Enabling Tracking on Windows Server AppFabric msdn magazine 28 Workflow Services use WF, you already get this kind of logging built into the framework. The same way WF has an extensible framework to persist idle instances, it also has an exten- sible framework to provide vis- ibility into workflow execution. This framework is called tracking, which transparently instruments a workflow, recording key events during its execution. Windows Server App Fabric uses this ex- tensibility to improve the built-in WF tracking functionality, record- ing execution events on a SQL Server database. Te Windows Server AppFabric tracking confguration is similar to that used for persistence and can be accessed via either the Windows Server AppFabric Confguration Wizard or Windows PowerShell cmdlets. In the Confgure WCF and WF for Application dialog discussed earlier, click Monitoring. Now you can choose to enable or disable the tracking and also the track- ing level, as shown in Figure 8. While confguring tracking in Windows Server AppFabric, you can choose fve monitoring levels: • Of has the same efect as disabling monitoring and is best used in scenarios that need minimal tracking overhead. • Error Only gives visibility to only critical events like errors and warnings. This mode is best for high-performance scenarios that need only minimal error logging. • Health Monitoring is the default monitoring level and contains all the data captured at the Errors Only level, plus some additional processing data. • End-to-End Monitoring contains all data from level Health Monitoring plus additional information to reconstruct the entire message fow. Tis is used in scenarios where a service calls another service. • Troubleshooting, as the name suggests, is the most verbose level and is useful in scenarios where an application is in an unhealthy state and needs to be fxed. Scaling the Workflow Service Because Windows Server AppFabric extends the Application Server Role from Windows Server, it inherits the highly scalable infrastructure from its predecessor and can be run on a server farm behind a network load balancer (NLB). 
You’ve also seen that it has the ability to persist and track workfow instances when needed. As a result, Windows Server AppFabric is an excellent choice to host long-running workfow processes and support a great number of requests from clients. An exam ple of a workfow service scalable environment can be seen in Figure 9. It has two Windows Server AppFabric instances, both running copies of the same workfow defnition. NLB routes requests to the available AppFabric instance. On the expense report scenario, when a client first accesses the service to create an expense report, the balancer redirects the requests to an available Windows Server AppFabric instance, which saves the expense data in the database, returns the generated ID to the client and, because the workfow becomes idle waiting for the expense approval from the workfow runtime, persists the running instance in the database. Later, when the client application accesses the service to approve or deny the expense report, the NLB redirects the request to an avail- able Windows Server AppFabric instance (it can be a diferent server from the frst service call), and the server correlates the request and restores the workfow instance from the persistence database. Now, the instance in memory continues processing, saves the approval on the database and returns to the client when it’s done. Closing Notes As you’ve seen, the use of workflow services with correlation, persistence and tracking on a load-balancing environment is a powerful technique for running those services in a scalable man- ner. The combination of these features can increase operations productivity, allowing proactive actions on running services, and spreading workfows across threads, processes and even machines. Tis allows developers to create a fully scalable solution that’s ready to run on a single machine—or even large server farms—with no worries about infrastructure complexity. For further information about designing workfows with WCF and WF, be sure to read Leon Welicki’s article, “Visual Design of Workfows with WCF and WF 4,” from the May 2010 issue of MSDN Magazine (msdn.microsoft.com/magazine/ff646977). And for a deeper discussion of long- running processes and workfow persistence, see Michael Kennedy’s article, “Web Apps Tat Support Long-Running Operations,” from the January 2009 issue (msdn.microsoft.com/magazine/dd296718). For details about Windows Server AppFabric, see the Windows Server Developer Center at msdn.microsoft.com/windowsserver/ee695849. „ RAFAEL GODINHO is an ISV developer evangelist at Microsof Brazil helping local partners adopt Microsof technology. You can contact Godinho through his blog at blogs.msdn.com/rafaelgodinho. THANKS to the following technical experts for reviewing this article: Dave Clife, Ron Jacobs and Leon Welicki Figure 9 Workflows in a Scalable Environment Client Client Client W C F WF P e r s i s t e n c e N L B Windows Server AppFabric Persistence Database Microsoft SQL Server W C F WF P e r s i s t e n c e Windows Server AppFabric Free 60 Day Evaluation! www.leadtools.com/msdn (800) 637-1840 ‡Silverlight: 100% pure Silverlight 3, 4 and Windows Phone Imaging SDK. ‡Image Formats & Compression: Supports 150+ image formats and compressions including TIFF, EXIF, PDF, JPEG2000, JBIG2 and CCITT G3/G4. ‡Scanning: TWAIN & WIA (32 & 64-bit), auto-detect optimum driver settings for high speed scanning. ‡Viewer Controls: Win Forms, Web Forms, WPF, Silverlight, ActiveX and COM. 
‡Image Processing: 200+ ¿lters, transforms, color conversion and drawing functions supporting region of interest and extended grayscale data. ‡Document Cleanup/Preprocessing: Auto-deskew, despeckle, hole punch, line and border removal, inverted text correction and more for optimum results in OCR and Barcode recognition. ‡Barcode: Auto-detect, read and write 1D and 2D barcodes for multithreaded 32 & 64 bit development. ‡OCR/ICR/OMR: Full page or zonal recognition for multithreaded 32 and 64 bit development with PDF, PDF/A, XPS, DOC, XML and Text output. ‡Forms Recognition & Processing: Automatically identify and classify forms and extract user ¿lled data. ‡PDF & PDF/A: Read, write and view searchable PDF with text, images, bookmarks and annotations. ‡Annotations: Interactive UI for document mark-up, redaction and image measurement (including support for DICOM annotations). ‡DICOM: Full support for all IOD classes and modalities de¿ned in the DICOM standard (including Encapsulated PDF/CDA and Raw Data). ‡PACS: Full support for DICOM messaging and secure communication enabling quick implementation of any DICOM SCU and SCP services. ‡Medical Image Viewer: High level display control with built-in tools for image mark-up, window level, measurement, zoom/pan, cine, and LUT manipulation. ‡Medical Web Viewer Framework: Plug-in enabled framework to quickly build high-quality, full-featured, web-based medical image delivery and viewer applications. ‡Medical Workstation Framework: Set of .NET medical and PACS components that can be used to build a full featured PACS Workstation application. ‡3D: Construct 3D volumes from 2D DICOM medical images and visualize with a variety of methods including MIP, MinIP, MRP, VRT and SSD. ‡Multimedia: Capture, play, stream and convert MPEG, AVI, WMV, MP4, MP3, OGG, ISO, DVD and more. Stream from RTSP Servers. ‡DVD: Play, create, convert and burn DVD images. ‡DVR: Pause, rewind and fast-forward live capture and UDP or TCP/IP streams. ‡MPEG Transport Stream: With DVR for UDP and TCP/IP streams & auto-live support. Develop your application with the same robust imaging technologies used by Microsoft, HP, Sony, Canon, Kodak, GE, Siemens, the US Air Force and Veterans Affairs Hospitals. Vector DICOM Medical Form Recognition & Processing Multimedia Barcode Document Silverlight, .NET, WPF, WCF, WF, C API, C++ Class Lib, COM & more! Install LEADTOOLS to eliminate months of research and programming time while maintaining high levels of quality, performance and functionality. LEADTOOLS provides developers easy access to decades of expertise in color, grayscale, document, medical, vector and multimedia imaging development. Untitled-2 1 12/13/10 4:29 PM msdn magazine 30 WI NDOWS WORKFLOW FOUNDATI ON 4 Authoring Control Flow Activities in WF 4 Control flow refers to the way in which the individual instructions in a program are organized and executed. In Windows Workfow Foundation 4 (WF 4), control fow activities govern the execution semantics of one or more child activities. Some examples in the WF 4 activity toolbox include Sequence, Parallel, If, ForEach, Pick, Flowchart and Switch, among others. Te WF runtime doesn’t have frst-class knowledge of any control fow like Sequence or Parallel. From its perspective, everything is just activities. Te runtime only enforces some simple rules (for example, “an activity can’t complete if any child activities are still running”). WF control fow is based on hierarchy—a WF program is a tree of activities. 
Control flow options in WF 4 are not limited to the activities shipped in the framework. You can write your own and use them in combination with the ones provided in the box, and that’s what this article will discuss. You’ll learn how to write your own control Leon Welicki fow activities following a “crawl, walk, run” approach: we’ll start with a very simple control flow activity and add richness as we progress, ending up with a new and useful control flow activity. Te source code for all of our examples is available for download. But frst, let’s start with some basic concepts about activities to set some common ground. Activities Activities are the basic unit of execution in a WF program; a work- fow program is a tree of activities that are executed by the WF runtime. WF 4 includes more than 35 activities, a comprehensive set that can be used to model processes or create new activities. Some of those activities govern the semantics of how other activ- i ties are executed (such as Sequence, Flowchart, Parallel and ForEach), and are known as composite activities. Others perform a single atomic task (WriteLine, InvokeMethod and so forth). We call these leaf activities. WF activities are implemented as CLR types, and as such they derive from other existing types. You can author activities visually and declaratively using the WF designer, or imperatively writing CLR code. The base types available to create your own custom activity are defned in the activities type hierarchy in Figure 1. You’ll find a detailed explanation of this type hierarchy in the MSDN Library at msdn.microsoft.com/library/dd560893. In this article I’ll focus on activities that derive from NativeActivity, the base class that provides access to the full breadth of the WF This article discusses: • The activities type hierarchy in Windows Workflow Foundation 4 • Composite and leaf activities • Bookmarks and execution properties • GoTo activities Technologies discussed: Windows Workflow Foundation 4.0 31 January 2011 msdnmagazine.com runtime. Control fow activities are composite activities that derive from the NativeActivity type because they need to interact with the WF runtime. Most commonly this is to schedule other activities (for example, Sequence, Parallel or Flowchart), but it may also include implementing custom cancellation using CancellationScope or Pick, creating bookmarks with Receive and persisting using Persist. Te activity data model defnes a crisp model for reasoning about data when authoring and consuming activities. Data is defned using arguments and variables. Arguments are the binding terminals of an activity and defne its public signature in terms of what data can be passed to the activity (input arguments) and what data will be returned by the activity when it completes its execution (output arguments). Variables represent temporary storage of data. Activity authors use arguments to defne the way data fows in and out of an activity, and they use variables in a couple of ways: • To expose a user-editable collection of variables on an activity defnition that can be used to share variables among multiple activities (such as a Variables collection in Sequence and Flowchart). • To model the internal state of an activity. Workfow authors use arguments to bind activities to the environ- ment by writing expressions, and they declare variables at diferent scopes in the workfow to share data between activities. 
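As a small illustration of that division of labor (a sketch, not code from the article's download), here's a coded workflow in which the workflow author declares a variable at Sequence scope and binds activity arguments to it with expressions:

using System;
using System.Activities;
using System.Activities.Statements;

class Program
{
  static void Main()
  {
    // A variable declared at Sequence scope, shared by the child activities
    Variable<string> name = new Variable<string>("name", "Workflow");

    Activity wf = new Sequence
    {
      Variables = { name },
      Activities =
      {
        // Arguments are bound to the environment through expressions
        new Assign<string>
        {
          To = new OutArgument<string>(name),
          Value = new InArgument<string>(ctx => name.Get(ctx) + "!")
        },
        new WriteLine
        {
          Text = new InArgument<string>(ctx => "Hello " + name.Get(ctx))
        }
      }
    };

    WorkflowInvoker.Invoke(wf);  // Prints "Hello Workflow!"
  }
}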
Together, variables and arguments combine to provide a predictable model of communication between activities. Now that I’ve covered some core activity basics, let’s start with the frst control fow activity. A Simple Control Flow Activity I’ll begin by creating a very simple control flow activity called ExecuteIfTrue. Tere’s not much to this activity: It executes a con- tained activity if a condition is true. WF 4 provides an If activity that includes Ten and Else child activities; ofen we only want to provide the Ten, and the Else is just overhead. For those cases we want an activity that executes another activity based on the value of a Boolean condition. H ere’s how this activity should work: • Te activity user must provide a Boolean condition. Tis argument is required. • Te activity user can provide a body—the activity to be executed if the condition is true. • At execution time: If the condition is true and the body is not null, execute the body. Here’s an implementation for an ExecuteIfTrue activity that behaves in exactly this way: public class ExecuteIfTrue : NativeActivity { [RequiredArgument] public InArgument<bool> Condition { get; set; } public Activity Body { get; set; } public ExecuteIfTrue() { } protected override void Execute(NativeActivityContext context) { if (context.GetValue(this.Condition) && this.Body != null) context.ScheduleActivity(this.Body); } } Tis code is very simple, but there’s more here than meets the eye. ExecuteIfTrue executes a child activity if a condition is true, so it needs to schedule another activity. Terefore, it derives from NativeActivity because it needs to interact with the WF runtime to schedule children. Once you’ve decided the base class for an activity, you have to define its public signature. In ExecuteIfTrue, this consists of a Boolean input argument of type InArgument<bool> named Condition that holds the condition to be evaluated, and a property of type Activity named Body with the activity to be executed if the condition is true. Te Condition argument is decorated with the RequiredArgument attribute that indicates to the WF run- time that it must be set with an expression. Te WF runtime will enforce this validation when preparing the activity for execution: [RequiredArgument] public InArgument<bool> Condition { get; set; } public Activity Body { get; set; } Te most interesting piece of code in this activity is the Execute method, which is where the “action” happens. All NativeActivities must override this method. Te Execute method receives a Native- ActivityContext argument, which is our point of interaction with the WF runtime as activity authors. In ExecuteIfTrue, this con- text is used to retrieve the value of the Condition argument (context.GetValue(this.Condition)) and to schedule the Body using the ScheduleActivity method. Notice that I say schedule and not execute. Te WF runtime does not execute the activities right away; instead it adds them to a list of work items to be scheduled for execution: protected override void Execute(NativeActivityContext context) { if (context.GetValue(this.Condition) && this.Body != null) context.ScheduleActivity(this.Body); } Note as well that the type has been designed following the create- set-use pattern. The XAML syntax is based on this pattern for designing types, where the type has a public default constructor and public read/write properties. Tis means the type will be XAML serialization-friendly. Te following code snippet shows how to use this activity. 
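You can see that enforcement without even running the workflow by asking the validation services directly. A quick sketch follows; the exact error text is illustrative rather than quoted from the framework, and the types used here (ActivityValidationServices, ValidationResults, ValidationError) live in System.Activities.Validation:

// An ExecuteIfTrue with no Condition set fails validation because of [RequiredArgument]
ValidationResults results = ActivityValidationServices.Validate(
  new ExecuteIfTrue { Body = new WriteLine { Text = "Rest!" } });

foreach (ValidationError error in results.Errors)
  Console.WriteLine(error.Message);  // The required-argument violation for 'Condition' shows up here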
In this example, if the current day is Saturday, you write the string "Rest!" to the console:

var act = new ExecuteIfTrue
{
  Condition = new InArgument<bool>(c => DateTime.Now.DayOfWeek == DayOfWeek.Saturday),
  Body = new WriteLine { Text = "Rest!" }
};
WorkflowInvoker.Invoke(act);

Figure 1 Activity Type Hierarchy (Activity and Activity<TResult> at the root, with CodeActivity, AsyncCodeActivity and NativeActivity plus their CodeActivity<TResult>, AsyncCodeActivity<TResult> and NativeActivity<TResult> counterparts deriving from them)

The first control flow activity has been created in 15 lines of code. But don't be fooled by the simplicity of that code—it's actually a fully functional control flow activity!

Scheduling Multiple Children
The next challenge is to write a simplified version of the Sequence activity. The goal of this exercise is to learn how to write a control flow activity that schedules multiple child activities and executes in multiple episodes. This activity is almost functionally equivalent to the Sequence shipped in the product. Here's how this activity should work:
• The activity user must provide a collection of children to be executed sequentially through the Activities property.
• At execution time:
  • The activity contains an internal variable with the index of the last item in the collection that has been executed.
  • If there are items in the children collection, schedule the first child.
  • When the child completes:
    • Increment the index of the last item executed.
    • If the index is still within the boundaries of the children collection, schedule the next child.
    • Repeat.

The code in Figure 2 implements a SimpleSequence activity that behaves exactly as described. Again, a fully functional control flow activity has been written in just a few lines of code—in this case, about 50 lines. The code is simple, but it introduces some interesting concepts.

SimpleSequence executes a collection of child activities in sequential order, so it needs to schedule other activities. Therefore, it derives from NativeActivity because it needs to interact with the runtime to schedule children.

The next step is to define the public signature for SimpleSequence. In this case it consists of a collection of activities (of type Collection<Activity>) exposed through the Activities property, and a collection of variables (of type Collection<Variable>) exposed through the Variables property. Variables allow sharing of data among all child activities. Note that these properties only have "getters" that expose the collections using a "lazy instantiation" approach (see Figure 3), so accessing these properties never results in a null reference. This makes these properties compliant with the create-set-use pattern.

There's one private member in the class that's not part of the signature: a Variable<int> named "current" that holds the index of the activity being executed:

// Pointer to the current item in the collection being executed
Variable<int> current = new Variable<int>() { Default = 0 };

Because this information is part of the internal execution state for SimpleSequence, you want to keep it private and not expose it to users of SimpleSequence. You also want it to be saved and restored when the activity is persisted. You use an ImplementationVariable for this. Implementation variables are variables that are internal to an activity. They're intended to be consumed by the activity author, not the activity user. Implementation variables are persisted when the activity is persisted and restored when the activity is reloaded, without requiring any work on our part. To make this clear—and continuing with the Sequence example—if an instance of SimpleSequence is persisted, when it wakes up it will "remember" the index of the last activity that has been executed.

Figure 2 The SimpleSequence Activity

public class SimpleSequence : NativeActivity
{
  // Child activities and variables collections
  Collection<Activity> activities;
  Collection<Variable> variables;

  // Pointer to the current item in the collection being executed
  Variable<int> current = new Variable<int>() { Default = 0 };

  public SimpleSequence() { }

  // Collection of children to be executed sequentially by SimpleSequence
  public Collection<Activity> Activities
  {
    get
    {
      if (this.activities == null)
        this.activities = new Collection<Activity>();
      return this.activities;
    }
  }

  public Collection<Variable> Variables
  {
    get
    {
      if (this.variables == null)
        this.variables = new Collection<Variable>();
      return this.variables;
    }
  }

  protected override void CacheMetadata(NativeActivityMetadata metadata)
  {
    metadata.SetChildrenCollection(this.activities);
    metadata.SetVariablesCollection(this.variables);
    metadata.AddImplementationVariable(this.current);
  }

  protected override void Execute(NativeActivityContext context)
  {
    // Schedule the first activity
    if (this.Activities.Count > 0)
      context.ScheduleActivity(this.Activities[0], this.OnChildCompleted);
  }

  void OnChildCompleted(NativeActivityContext context, ActivityInstance completed)
  {
    // Calculate the index of the next activity to be scheduled
    int currentExecutingActivity = this.current.Get(context);
    int next = currentExecutingActivity + 1;

    // If the index is within boundaries...
    if (next < this.Activities.Count)
    {
      // Schedule the next activity
      context.ScheduleActivity(this.Activities[next], this.OnChildCompleted);

      // Store the index in the collection of the activity executing
      this.current.Set(context, next);
    }
  }
}

The WF runtime can't automatically know about implementation variables. If you want to use an ImplementationVariable in an activity, you need to explicitly inform the WF runtime. You do this during CacheMetadata method execution. Despite its rather frightening name, CacheMetadata is not that difficult. Conceptually, it's actually simple: It's the method where an activity "introduces itself" to the runtime. Think about the If activity for a moment. In CacheMetadata this activity would say: "Hi, I'm the If activity and I have an input argument named Condition, and two children: Then and Else." In the SimpleSequence case, SimpleSequence is saying: "Hi, I'm SimpleSequence and I have a collection of child activities, a collection of variables and an implementation variable." There's no more than that in the SimpleSequence code for CacheMetadata:

protected override void CacheMetadata(NativeActivityMetadata metadata)
{
  metadata.SetChildrenCollection(this.activities);
  metadata.SetVariablesCollection(this.variables);
  metadata.AddImplementationVariable(this.current);
}

The default implementation of CacheMetadata uses reflection to get this data from the activity. In the ExecuteIfTrue example, I didn't implement CacheMetadata and relied on the default implementation to reflect on public members. For SimpleSequence, in contrast, I did need to implement it because the default implementation can't "guess" about my desire to use implementation variables.
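For comparison, an explicit CacheMetadata for ExecuteIfTrue would be roughly equivalent to what reflection produces. The following is a sketch of the usual hand-written pattern (declare the argument, bind it, register the child), not code from the article's download; for SimpleSequence itself, the override shown in Figure 2 remains necessary because reflection can't discover the private implementation variable.

// Roughly what the default, reflection-based CacheMetadata surfaces for ExecuteIfTrue
protected override void CacheMetadata(NativeActivityMetadata metadata)
{
  RuntimeArgument conditionArgument =
    new RuntimeArgument("Condition", typeof(bool), ArgumentDirection.In, true);
  metadata.Bind(this.Condition, conditionArgument);
  metadata.AddArgument(conditionArgument);

  if (this.Body != null)
    metadata.AddChild(this.Body);
}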
Te next interesting piece of code in this activity is the Execute method. In this case, if there are activities in the collection, you tell the WF runtime: “Please execute the frst activity in the collection of activi- ties and, when you’re done, invoke the OnChildCompleted method.” You say this in WF terms using NativeActivityContext.ScheduleActivity. Notice that when you schedule an activity, you supply a second argument that’s a CompletionCallback. In simple terms, this is a method that will be called once the activity execution is completed. Again, it’s important to be aware of the diference between scheduling and execution. Te CompletionCallback won’t be called when the activity is scheduled; it will be called when the scheduled activity execution has been completed: protected override void Execute(NativeActivityContext context) { // Schedule the first activity if (this.Activities.Count > 0) context.ScheduleActivity(this.Activities[0], this.OnChildCompleted); } Te OnChildCompleted method is the most interesting part of this activity from a learning perspective, and it’s actually the main reason I’ve included SimpleSequence in this article. Tis method gets the next activity in the collection and schedules it. When the next child is scheduled, a CompletionCallback is provided, which in this case points to this same method. Tus, when a child completes, this method will execute again to look for the next child and execute it. Clearly, execution is happening in pulses or episodes. Because workfows can be persisted and unloaded from memory, there can be a large time diference between two pulses of execu- tion. Moreover, these pulses can be executed in diferent threads, processes or even machines (as persisted instances of a workfow can be reloaded in a diferent process or machine). Learning how to program for multiple pulses of execution is one of the biggest challenges in becoming an expert control fow activity author: void OnChildCompleted(NativeActivityContext context, ActivityInstance completed) { // Calculate the index of the next activity to scheduled int currentExecutingActivity = this.current.Get(context); int next = currentExecutingActivity + 1; // If index within boundaries... if (next < this.Activities.Count) { // Schedule the next activity context.ScheduleActivity(this.Activities[next], this.OnChildCompleted); // Store the index in the collection of the activity executing this.current.Set(context, next); } } The following code snippet shows how to use this activity. In this example, I’m writing three strings to the console (“Hello,” “Workflow” and “!”): var act = new SimpleSequence() { Activities = { new WriteLine { Text = "Hello" }, new WriteLine { Text = "Workflow" }, new WriteLine { Text = "!" } } }; WorkflowInvoker.Invoke(act); I’ve authored my own SimpleSequence! Now it’s time to move on to the next challenge. Implementing a New Control Flow Pattern Next, I’ll create a complex control fow activity. As I mentioned earlier, you’re not limited to the control fow activities shipped in WF 4. Tis section demonstrates how to build your own control fow activity to support a control fow pattern that’s not supported out-of-the-box by WF 4. Te new control fow activity will be called Series. 
Its goal is simple: to provide a Sequence with support for GoTos, where the public Collection<Activity> Activities { get { if (this.activities == null) this.activities = new Collection<Activity>(); return this.activities; } } public Collection<Variable> Variables { get { if (this.variables == null) this.variables = new Collection<Variable>(); return this.variables; } } Figure 3 A Lazy Instantiation Approach msdn magazine 34 Windows Workflow Foundation 4 next activity to be executed can be manipulated either explicitly from inside the workfow (through a GoTo activity) or from the host (by resuming a well-known bookmark). To implement this new control flow, I’ll need to author two activities: Series, a composite activity that contains a collection of activities and executes them sequentially (but allows jumps to any item in the sequence), and GoTo, a leaf activity that I’ll use inside of Series to explicitly model the jumps. To recap, I’ll enumerate the goals and requirements for the custom control activity: 1. It’s a Sequence of activities. 2. It can contain GoTo activities (at any depth), which change the point of execution to any direct child of the Series. 3. It can receive GoTo messages from the outside (for example, from a user), which can change the point of execution to any direct child of the Series. I’ll sta rt by implementing the Series activity. Here’s the execution semantics in simple terms: • Te activity user must provide a collection of children to be executed sequentially through the Activities property. • In the execute method: • Create a bookmark for the GoTo in a way that’s available to child activities. • Te activity contains an internal variable with the activity instance being executed. • If there are items in the children collection, schedule the frst child. • When the child completes: • Lookup the completed a ctivity in the Activities collection. • Increment the index of the last item executed. • If the index is still within the boundaries of the children collection, schedule the next child. • Repeat. • If the GoTo bookmark is resumed: • Get the name of the activity we want to go to. • Find that activity in the activities collection. • Schedule the target activity in the set for execution and register a completion callback that will schedule the next activity. • Cancel the activity that’s currently executing. • Store the activity that’s currently being executed in the “current” variable. Te code example in Figure 4 shows the implementation for a Series activity that behaves exactly as described. Some of this code will look familiar from the previous examples. I’ll discuss the implementation of this activity. Series derives from NativeActivity because it needs to interact with the WF runtime to schedule child activities, create bookmarks, cancel children and use execution properties. As before, the next step is to defne the public signature for Series. As in SimpleSequence, there are Activities and Variables collection prop- erties. Tere’s also a string input argument named BookmarkName (of type InArgument<string>), with the name of the bookmark to be created for host resumption. Again, I’m following the create-set-use pattern in the activity type. Series has a private member named “current” that contains the ActivityInstance being executed, instead of just a pointer to an item in a collection, as in SimpleSequence. Why is current a Variable<ActivityInstance> and not a Variable<int>? 
Because I need to get ahold of the currently executing child later in this activity during the GoTo method. I’ll explain the actual details later; the important thing to understand now is that I’ll have an imple- mentation variable that holds the activity instance being executed: Variable<ActivityInstance> current = new Variable<ActivityInstance>(); In CacheMetadata you provide runtime information about your activity: the children and variables collections, the implementation variable with the current activity instance and the bookmark name argument. Te only diference from the previous example is that I’m manually registering the BookmarkName input argument within the WF runtime—adding a new instance of RuntimeArgument to the activity metadata: protected override void CacheMetadata(NativeActivityMetadata metadata) { metadata.SetVariablesCollection(this.Variables); metadata.SetChildrenCollection(this.Activities); metadata.AddImplementationVariable(this.current); metadata.AddArgument(new RuntimeArgument("BookmarkName", typeof(string), ArgumentDirection.In)); } Te next new thing is the CanInduceIdle property overload. Tis is just more metadata that the activity provides to the WF runtime. When this property returns true, I’m telling the runtime that this activity can cause the workfow to become idle. I need to override this property and return true for activities that create bookmarks, as they’ll make the workfow go idle waiting for its resumption. Te default value for this property is false. If this property returns false and we create a bookmark, I’ll have an InvalidOperationException exception when executing the activity: protected override bool CanInduceIdle { get { return true; } } Tings get more interesting in the Execute method, where I cre- ate a bookmark (internalBookmark) and store it in an execution property. Before going further, though, let me introduce bookmarks and execution properties. Bookmarks are the mechanism by which an activity can passively wait to be resumed. When an activity wishes to “block” pending a certain event, it registers a bookmark and then returns an execution status of continuing. Tis signals the runtime that although the activity’s execution is not complete, it doesn’t have any more work to do as part of the current work item. When you use bookmarks, you can author your activities using a form of reactive execution: when the bookmark is created the activity yields, and when the bookmark is resumed a block of code (the bookmark resumption callback) is invoked in reaction to the bookmark resumption. Unlike programs directly targeting the CLR, workfow programs are hierarchically scoped trees that execute in a thread-agnostic environment. Tis implies that the standard thread local storage (TLS) mechanisms can’t be directly leveraged to determine what context is in scope for a given work item. Te workfow execution context introduces execution properties to an activity’s environment, so that an activity can declare properties that are in scope for its sub-tree and share them with its children. As a result, an activity can share data with its descendants through these properties. 35 January 2011 msdnmagazine.com Now that you know about bookmarks and execution properties, let’s get back to the code. What I’m doing at the beginning of the Execute method is creating a bookmark (using context.CreateBookmark) and saving it into an execution property (using context.Properties.Add). 
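To ground the bookmark idea before returning to Series, here's a small self-contained sketch (not from the article) of an activity that blocks on a named bookmark and a host that resumes it through WorkflowApplication; the activity and bookmark names are made up for the example:

using System;
using System.Activities;
using System.Activities.Statements;
using System.Threading;

// Waits passively on a named bookmark; the host supplies the data that resumes it
public sealed class WaitForInput : NativeActivity
{
  public InArgument<string> BookmarkName { get; set; }

  // Creating a bookmark makes the workflow go idle, so this must return true
  protected override bool CanInduceIdle { get { return true; } }

  protected override void Execute(NativeActivityContext context)
  {
    context.CreateBookmark(this.BookmarkName.Get(context), this.OnResumed);
  }

  void OnResumed(NativeActivityContext context, Bookmark bookmark, object value)
  {
    Console.WriteLine("Resumed with: " + value);
  }
}

class Program
{
  static void Main()
  {
    Activity workflow = new Sequence
    {
      Activities =
      {
        new WriteLine { Text = "Waiting for the host..." },
        new WaitForInput { BookmarkName = "UserInput" }
      }
    };

    AutoResetEvent idle = new AutoResetEvent(false);
    AutoResetEvent done = new AutoResetEvent(false);

    WorkflowApplication app = new WorkflowApplication(workflow);
    app.Idle = e => idle.Set();
    app.Completed = e => done.Set();

    app.Run();
    idle.WaitOne();                               // The bookmark made the instance go idle
    app.ResumeBookmark("UserInput", "approved");  // Execution continues in OnResumed
    done.WaitOne();
  }
}

The Idle notification is what tells the host the instance has gone passive. The Series activity relies on the same mechanism, except it stashes the bookmark in an execution property so the GoTo child can find it.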
This bookmark is a multiple-resume bookmark, meaning it can be resumed multiple times and it will be available while its parent activity is in an executing state. It’s also NonBlocking, so it won’t prevent the activity from completing once it’s done with its job. When that bookmark is resumed, the GoTo method will be called because I provided a BookmarkCompletionCallback to CreateBookmark (the frst parameter). Te reason to save it into an execution property is to make it available to all child activities. (You’ll see later how the GoTo activity uses this bookmark.) Notice that execution properties have names. Because that name is a string, I defned a constant (Goto- PropertyName) with the name for the property in the activity. Tat name follows a fully qualifed name approach. Tis is a best practice: internal static readonly string GotoPropertyName = "Microsoft.Samples.CustomControlFlow. Series.Goto"; ... ... // Create a bookmark for signaling the GoTo Bookmark internalBookmark = context.CreateBookmark(this.Goto, BookmarkOptions.MultipleResume | BookmarkOptions. NonBlocking); // Save the name of the bookmark as an execution property context.Properties.Add(GotoPropertyName, internalBookmark); Once I’ve declared the bookmark, I’m ready to schedule my frst activity. I’m already familiar with this, because I did it in my previous public class Series : NativeActivity { internal static readonly string GotoPropertyName = "Microsoft.Samples.CustomControlFlow.Series.Goto"; // Child activities and variables collections Collection<Activity> activities; Collection<Variable> variables; // Activity instance that is currently being executed Variable<ActivityInstance> current = new Variable<ActivityInstance>(); // For externally initiated goto's; optional public InArgument<string> BookmarkName { get; set; } public Series() { } public Collection<Activity> Activities { get { if (this.activities == null) this.activities = new Collection<Activity>(); return this.activities; } } public Collection<Variable> Variables { get { if (this.variables == null) this.variables = new Collection<Variable>(); return this.variables; } } protected override void CacheMetadata(NativeActivityMetadata metadata) { metadata.SetVariablesCollection(this.Variables); metadata.SetChildrenCollection(this.Activities); metadata.AddImplementationVariable(this.current); metadata.AddArgument(new RuntimeArgument("BookmarkName", typeof(string), ArgumentDirection.In)); } protected override bool CanInduceIdle { get { return true; } } protected override void Execute(NativeActivityContext context) { // If there activities in the collection... if (this.Activities.Count > 0) { // Create a bookmark for signaling the GoTo Bookmark internalBookmark = context.CreateBookmark(this.Goto, BookmarkOptions.MultipleResume | BookmarkOptions. 
I'll schedule the first activity in the collection and tell the runtime to invoke the OnChildCompleted method when the activity is done (as I did in SimpleSequence). context.ScheduleActivity returns an ActivityInstance that represents an instance of an activity being executed, which I assign to the "current" implementation variable. Let me clarify this a bit: Activity is the definition, like a class; ActivityInstance is the actual instance, like an object. We can have multiple ActivityInstances from the same Activity:

// Schedule the first item in the list and save the resulting
// ActivityInstance in the "current" implementation variable
this.current.Set(context,
  context.ScheduleActivity(this.Activities[0], this.OnChildCompleted));

Finally, we create a bookmark that can be used by the host to jump to any activity within the series. The mechanics for this are simple: because the host knows the name of the bookmark, it can resume it to jump to any activity within the Series:

// Create a bookmark for external (host) resumption
if (this.BookmarkName.Get(context) != null)
  context.CreateBookmark(this.BookmarkName.Get(context), this.Goto,
    BookmarkOptions.MultipleResume | BookmarkOptions.NonBlocking);

The OnChildCompleted method should be straightforward now, as it's very similar to the one in SimpleSequence: I look up the next element in the activities collection and schedule it. The main difference is that I only schedule the next activity if the current activity successfully completed its execution (that is, it reached the Closed state and hasn't been canceled or faulted).

The GoTo method is arguably the most interesting. This is the method that gets executed as the result of the GoTo bookmark being resumed.
It receives some data as input, which is passed when the bookmark is resumed. In this case, the data is the name of the activity we want to go to:

void Goto(NativeActivityContext context, Bookmark b, object data)
{
  // Get the name of the activity to go to
  string targetActivityName = data as string;
  ...
}

The target activity name is the DisplayName property of the activity. I look up the requested activity definition in the Activities collection. Once I find the requested activity, I schedule it, indicating that when the activity is done, the OnChildCompleted method should be executed:

// Find the activity to go to in the children list
Activity targetActivity = this.Activities
  .Where<Activity>(a => a.DisplayName.Equals(targetActivityName))
  .Single();

// Schedule the activity
ActivityInstance instance =
  context.ScheduleActivity(targetActivity, this.OnChildCompleted);

Next, I cancel the activity instance that's currently being executed and set the current activity being executed to the ActivityInstance scheduled in the previous step. For both of these tasks I use the "current" variable: first, I pass it as a parameter to the CancelChild method of NativeActivityContext, and then I update its value with the ActivityInstance that was scheduled in the previous block of code:

// Cancel the activity that is currently executing
context.CancelChild(this.current.Get(context));

// Set the activity that is executing now as the current
this.current.Set(context, instance);

The GoTo Activity
The GoTo activity can be used only inside a Series activity, to jump to an activity in its Activities collection. It's similar to a GoTo statement in an imperative program. The way it works is very simple: it resumes the GoTo bookmark created by the Series activity in which it's contained, indicating the name of the activity we want to go to. When the bookmark is resumed, the Series will jump to the activity indicated. Here's a simple description of the execution semantics:

• The activity user must provide a string TargetActivityName. This argument is required.
• At execution time:
  • The GoTo activity will locate the "GoTo" bookmark created by the Series activity.
  • If the bookmark is found, it will resume it, passing the TargetActivityName.
  • It will create a synchronization bookmark, so the activity does not complete.
  • It will be cancelled by the Series.

The code in Figure 5 shows the implementation of a GoTo activity that behaves exactly as described. GoTo derives from NativeActivity because it needs to interact with the WF runtime to create and resume bookmarks and use execution properties. Its public signature consists of the TargetActivityName string input argument that contains the name of the activity we want to jump to.

public class GoTo : NativeActivity
{
  public GoTo() { }

  [RequiredArgument]
  public InArgument<string> TargetActivityName { get; set; }

  protected override bool CanInduceIdle
  {
    get { return true; }
  }

  protected override void Execute(NativeActivityContext context)
  {
    // Get the bookmark created by the parent Series
    Bookmark bookmark =
      context.Properties.Find(Series.GotoPropertyName) as Bookmark;

    // Resume the bookmark passing the target activity name
    context.ResumeBookmark(bookmark, this.TargetActivityName.Get(context));

    // Create a bookmark to leave this activity idle waiting when it does
    // not have any further work to do. Series will cancel this activity
    // in its GoTo method
    context.CreateBookmark("SyncBookmark");
  }
}

Figure 5 The GoTo Activity
I decorated that argument with the RequiredArgument attribute, meaning that the WF validation services will enforce that it's set with an expression. I rely on the default implementation of CacheMetadata, which reflects on the public surface of the activity to find and register runtime metadata.

The most important part is the Execute method. I first look for the bookmark created by the parent Series activity. Because the bookmark was stored as an execution property, I look for it in context.Properties. Once I find that bookmark, I resume it, passing the TargetActivityName as input data. This bookmark resumption results in the Series.Goto method being invoked (because it's the bookmark callback supplied when the bookmark was created). That method will look for the next activity in the collection, schedule it and cancel the activity that's currently executing:

// Get the bookmark created by the parent Series
Bookmark bookmark =
  context.Properties.Find(Series.GotoPropertyName) as Bookmark;

// Resume the bookmark passing the target activity name
context.ResumeBookmark(bookmark, this.TargetActivityName.Get(context));

The final line of code is the trickiest one: creating a bookmark for synchronization that will keep the GoTo activity running. Hence, when the GoTo.Execute method is complete, this activity will still be in the executing state, waiting for a stimulus to resume the bookmark. When I discussed the code for Series.Goto, I mentioned that it cancelled the activity being executed. In this case, Series.Goto is actually canceling a GoTo activity instance that's waiting for this bookmark to be resumed.

To explain in more detail: the instance of the GoTo activity was scheduled by the Series activity. When this activity completes, the completion callback in Series (OnChildCompleted) looks for the next activity in the Series.Activities collection and schedules it. In this case I don't want to schedule the next activity—I want to schedule the activity referenced by TargetActivityName. The bookmark enables this because it keeps the GoTo activity in an executing state while the target activity is being scheduled. When GoTo is cancelled, there's no action in the Series.OnChildCompleted callback, because it only schedules the next activity if the completion state is Closed (and, in this case, it is Cancelled):

// Create a bookmark to leave this activity idle waiting when it does
// not have any further work to do. Series will cancel this activity
// in its GoTo method
context.CreateBookmark("SyncBookmark");

Figure 6 shows an example using this activity. In this case, I'm looping back to a previous state according to the value of a variable. This is a simple example to illustrate the basic use of Series, but this activity can be used to implement complex, real-world business scenarios where you need to skip, redo or jump to steps in a sequential process.
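Figure 6 drives the workflow with WorkflowInvoker, so the jumps come from the GoTo activity inside the workflow. If the optional BookmarkName argument were set, a host could trigger the same jump from the outside by resuming that bookmark with the DisplayName of the target activity. Here's a minimal sketch of that host side—my own illustration, not part of the article's sample; the bookmark name "jump" and the hosting details are assumptions:

// Host-side sketch: jump a running Series to "First Step" from outside the workflow.
// Assumes the Series (act) was configured with BookmarkName = "jump" and the
// workflow has gone idle on its bookmarks.
WorkflowApplication app = new WorkflowApplication(act);
app.Idle = e => Console.WriteLine("Workflow is idle; bookmarks can be resumed.");
app.Run();
// ... later, when the host decides to jump:
app.ResumeBookmark("jump", "First Step");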
var counter = new Variable<int>();

var act = new Series
{
  Variables = { counter },
  Activities =
  {
    new WriteLine { DisplayName = "Start", Text = "Step 1" },
    new WriteLine { DisplayName = "First Step", Text = "Step 2" },
    new Assign<int>
    {
      To = counter,
      Value = new InArgument<int>(c => counter.Get(c) + 1)
    },
    new If
    {
      Condition = new InArgument<bool>(c => counter.Get(c) == 3),
      Then = new WriteLine { Text = "Step 3" },
      Else = new GoTo { TargetActivityName = "First Step" }
    },
    new WriteLine { Text = "The end!" }
  }
};

WorkflowInvoker.Invoke(act);

Figure 6 Using GoTo in a Series

Go with the Flow
In this article I presented the general aspects of writing custom control flow activities. In WF 4, the control flow spectrum is not fixed; writing custom activities has been dramatically simplified. If the activities provided out of the box don't meet your needs, you can easily create your own. In this article I started with a simple control flow activity and then worked my way up to implement a custom control flow activity that adds new execution semantics to WF 4. If you want to learn more, a Community Technology Preview for State Machine is available at CodePlex with full source code. You'll also find a series of Channel 9 videos on activity authoring best practices. By writing your own custom activities, you can express any control flow pattern in WF and accommodate WF to the particulars of your problem.

LEON WELICKI is a program manager on the Windows Workflow Foundation (WF) team at Microsoft, working on the WF runtime. Prior to joining Microsoft, he worked as lead architect and dev manager for a large Spanish telecom company and as an external associate professor on the graduate computer science faculty at the Pontifical University of Salamanca at Madrid.

THANKS to the following technical experts for reviewing this article: Joe Clancy, Dan Glick, Rajesh Sampath, Bob Schmidt and Isaac Yuen

References
Windows Workflow Foundation 4 Developer Center: msdn.microsoft.com/netframework/aa663328
Endpoint.tv: Activities Authoring Best Practices: channel9.msdn.com/shows/Endpoint/endpointtv-Workflow-and-Custom-Activities-Best-Practices-Part-1/
Designing and Implementing Custom Activities: msdn.microsoft.com/library/dd489425
ActivityInstance Class: msdn.microsoft.com/library/system.activities.activityinstance
RuntimeArgument Class: msdn.microsoft.com/library/dd454495

SILVERLIGHT EXPOSED
Using MEF to Expose Interfaces in Your Silverlight MVVM Apps

While many developers may think of Silverlight as a Web-centric technology, in practice it has become a great platform for building any type of application. Silverlight has built-in support for concepts such as data binding, value converters, navigation, out-of-browser operation and COM Interop, making it relatively straightforward to create all kinds of applications. And when I say all kinds, I also mean enterprise applications.

Creating a Silverlight application with the Model-View-ViewModel (MVVM) pattern gives you, in addition to the features already in Silverlight, the advantages of greater maintainability, testability and separation of your UI from the logic behind it.
And, of course, you don’t have to fgure all this out on your own. Tere’s a wealth of information and tools out there to help you get started. For example, the MVVM Light Toolkit (mvvmlight.codeplex.com) is a lightweight framework for implementing MVVM with Silverlight and Windows Presentation Foundation (WPF), and WCF RIA Services (silverlight.net/getstarted/riaservices) helps you easily access Sandrino Di Mattia Windows Communication Foundation (WCF) services and data- bases thanks to code generation. You can take your Silverlight application a step further with the Managed Extensibility Framework (mef.codeplex.com), also known as MEF. Tis framework provides the plumbing to create extensible applications using components and composition. In the rest of the article I’ll show you how to use MEF to get cen- tralized management of the View and ViewModel creation. Once you have this, you can go much further than just putting a ViewModel in the DataContext of the View. All this will be done by customizing the built-in Silverlight navigation. When the user navigates to a given URL, MEF intercepts this request, looks at the route (a bit like ASP.NET MVC), fnds a matching View and ViewModel, notifes the ViewModel of what’s happening and displays the View. Getting Started with MEF Because MEF is the engine that will connect all the parts of this example, it’s best to start with it. If you’re not familiar with MEF already, start by reading Glenn Block’s article, “Building Composable Apps in .NET 4 with the Managed Extensibility Framework,” in the February 2010 issue of MSDN Magazine (msdn.microsoft.com/magazine/ee291628). First you need to correctly confgure MEF when the application starts by handling the Startup event of the App class: private void OnStart(object sender, StartupEventArgs e) { // Initialize the container using a deployment catalog. var catalog = new DeploymentCatalog(); var container = CompositionHost.Initialize(catalog); // Export the container as singleton. container.ComposeExportedValue<CompositionContainer>(container); // Make sure the MainView is imported. CompositionInitializer.SatisfyImports(this); } This article discusses: • Getting started with MEF • Extending Silverlight and MEF • Custom navigation • Handling the View and ViewModel Technologies discussed: ASP.NET, Silverlight Code download available at: code.msdn.microsoft.com/mag201101MEF EXPERIENCE Beautiful Data Visualizations That Bring Your Data to Life Use our Motion Framework ™ to see your data over time and give your users new insight into their data. Visit infragistics.com to try it today! Copyright 1996-2010 Infragistics, Inc. All rights reserved. Infragistics, the Infragistics logo and NetAdvantage are registered trademarks of Infragistics, Inc. Motion Framework is a trademark of Infragistics, Inc. Infragistics Sales 800 231 8588 Infragistics Europe Sales +44 (0) 800 298 9055 Infragistics India +91 80 4151 8042 @infragistics Untitled-6 1 11/10/10 10:57 AM msdn magazine 42 Silverlight Exposed Te deployment catalog makes sure all assemblies are scanned for exports and is then used to create a CompositionContainer. Because the navigation will require this container to do some work later on, it’s important to register the instance of this container as an exported value. Tis will allow the same container to be imported whenever required. Another option would be to store the container as a static object, but this would create tight coupling between the classes, which isn’t suggested. 
Extending Silverlight Navigation
Silverlight Navigation Application is a Visual Studio template that enables you to quickly create applications that support navigation using a Frame that hosts the content. The great thing about Frames is that they integrate with the Back and Forward buttons of your browser and they support deep linking. Look at the following:

<navigation:Frame x:Name="ContentFrame"
                  Style="{StaticResource ContentFrameStyle}"
                  Source="Customers"
                  NavigationFailed="OnNavigationFailed">
  <i:Interaction.Behaviors>
    <fw:CompositionNavigationBehavior />
  </i:Interaction.Behaviors>
</navigation:Frame>

This is just a regular frame that starts by navigating to Customers. As you can see, this Frame doesn't contain a UriMapper (where you could link Customers to a XAML file, such as /Views/Customers.xaml). The only thing it contains is my custom behavior, CompositionNavigationBehavior. A behavior (from the System.Windows.Interactivity assembly) allows you to extend existing controls, such as a Frame in this case. Figure 1 shows the behavior.

Let's take a look at what this CompositionNavigationBehavior does. The first thing you can see is that the behavior wants a CompositionContainer and a CompositionNavigationContentLoader (more on this later) because of the Import attributes. The constructor then forces the Import using the SatisfyImports method on the CompositionInitializer. Note that you should only use this method when you don't have another choice, as it actually couples your code to MEF.

When the Frame is attached, a NavigationService is created and wrapped around the Frame. Using ComposeExportedValue, the instance of this wrapper is registered in the container. When the container was created, the instance of the container was also registered in itself. As a result, an Import of CompositionContainer will always give you the same object; this is why I used ComposeExportedValue in the Startup event of the App class. Now the CompositionNavigationBehavior asks for a CompositionContainer using the Import attribute and will get it after SatisfyImports runs.

When registering the instance of INavigationService, the same thing happens. It's now possible from anywhere in the application to ask for an INavigationService (that wraps around a Frame).
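The NavigationService wrapper created in RegisterNavigationService isn't listed in the article; a minimal sketch of what it might look like—an assumption on my part, not the download's actual implementation—simply formats the path and delegates to the Frame, behind the interface shown next:

// Sketch only: wraps a Frame behind INavigationService.
public class NavigationService : INavigationService
{
  private readonly Frame frame;

  public NavigationService(Frame frame)
  {
    this.frame = frame;
  }

  public void Navigate(string path)
  {
    // The custom content loader interprets the path (for example, "Customer/1").
    frame.Navigate(new Uri(path, UriKind.Relative));
  }

  public void Navigate(string path, params object[] args)
  {
    Navigate(string.Format(path, args));
  }
}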
Without having to couple your ViewModels to a Frame, you get access to the following:

public interface INavigationService
{
  void Navigate(string path);
  void Navigate(string path, params object[] args);
}

public class CompositionNavigationBehavior : Behavior<Frame>
{
  private bool processed;

  [Import]
  public CompositionContainer Container { get; set; }

  [Import]
  public CompositionNavigationContentLoader Loader { get; set; }

  public CompositionNavigationBehavior()
  {
    if (!DesignerProperties.IsInDesignTool)
      CompositionInitializer.SatisfyImports(this);
  }

  protected override void OnAttached()
  {
    base.OnAttached();
    if (!processed)
    {
      this.RegisterNavigationService();
      this.SetContentLoader();
      processed = true;
    }
  }

  private void RegisterNavigationService()
  {
    var frame = AssociatedObject;
    var svc = new NavigationService(frame);
    Container.ComposeExportedValue<INavigationService>(svc);
  }

  private void SetContentLoader()
  {
    var frame = AssociatedObject;
    frame.ContentLoader = Loader;
    frame.JournalOwnership = JournalOwnership.Automatic;
  }
}

Figure 1 CompositionNavigationBehavior

[MetadataAttribute]
[AttributeUsage(AttributeTargets.Class, AllowMultiple = false)]
public class ViewModelExportAttribute : ExportAttribute, IViewModelMetadata
{
  public Type ViewModelContract { get; set; }
  public string NavigationPath { get; set; }
  public string Key { get; set; }

  public ViewModelExportAttribute(Type viewModelContract, string navigationPath)
    : base(typeof(IViewModel))
  {
    this.NavigationPath = navigationPath;
    this.ViewModelContract = viewModelContract;

    if (NavigationPath != null && NavigationPath.Contains("/"))
    {
      // Split the path to get the arguments.
      var split = NavigationPath.Split(new char[] { '/' },
        StringSplitOptions.RemoveEmptyEntries);

      // Get the key.
      Key = split[0];
    }
    else
    {
      // No arguments, use the whole key.
      Key = NavigationPath;
    }
  }
}

Figure 2 Creating the ViewModelExportAttribute

Now, let's assume you have a ViewModel showing all of your customers and this ViewModel should be able to open a specific customer. This could be achieved using the following code:

[Import]
public INavigationService NavigationService { get; set; }

private void OnOpenCustomer()
{
  NavigationService.Navigate("Customer/{0}", SelectedCustomer.Id);
}

But before jumping ahead, let's discuss the SetContentLoader method in the CompositionNavigationBehavior. It changes the ContentLoader of the Frame. This is a perfect example of the support for extensibility in Silverlight: you can provide your own ContentLoader (one that implements the INavigationContentLoader interface) to really provide something to show in the Frame. Now that you can see how things start falling into place, the following topic—extending MEF—will become clear.

Back to Extending MEF
The goal here is that you can navigate to a certain path (be it from a ViewModel or your browser address bar) and the CompositionNavigationContentLoader does the rest. It should parse the URI, find a matching ViewModel and a matching View, and combine them. Normally you'd write something like this:

[Export(typeof(IMainViewModel))]
public class MainViewModel

In this case it would be interesting to use the Export attribute with some extra configuration, referred to as metadata. Figure 2 shows an example of a metadata attribute. This attribute doesn't do anything special.
[Export]
public class CompositionNavigationContentLoader : INavigationContentLoader
{
  [ImportMany(typeof(IView))]
  public IEnumerable<ExportFactory<IView, IViewMetadata>> ViewExports { get; set; }

  [ImportMany(typeof(IViewModel))]
  public IEnumerable<ExportFactory<IViewModel, IViewModelMetadata>> ViewModelExports { get; set; }

  public bool CanLoad(Uri targetUri, Uri currentUri)
  {
    return true;
  }

  public void CancelLoad(IAsyncResult asyncResult)
  {
    return;
  }

  public IAsyncResult BeginLoad(Uri targetUri, Uri currentUri,
    AsyncCallback userCallback, object asyncState)
  {
    // Convert to a dummy relative Uri so we can access the host.
    var relativeUri = new Uri("http://" + targetUri.OriginalString, UriKind.Absolute);

    // Get the factory for the ViewModel.
    var viewModelMapping = ViewModelExports.FirstOrDefault(o =>
      o.Metadata.Key.Equals(relativeUri.Host, StringComparison.OrdinalIgnoreCase));
    if (viewModelMapping == null)
      throw new InvalidOperationException(
        String.Format("Unable to navigate to: {0}. " +
          "Could not locate the ViewModel.", targetUri.OriginalString));

    // Get the factory for the View.
    var viewMapping = ViewExports.FirstOrDefault(o =>
      o.Metadata.ViewModelContract == viewModelMapping.Metadata.ViewModelContract);
    if (viewMapping == null)
      throw new InvalidOperationException(
        String.Format("Unable to navigate to: {0}. " +
          "Could not locate the View.", targetUri.OriginalString));

    // Resolve both the View and the ViewModel.
    var viewFactory = viewMapping.CreateExport();
    var view = viewFactory.Value as Control;
    var viewModelFactory = viewModelMapping.CreateExport();
    var viewModel = viewModelFactory.Value as IViewModel;

    // Attach ViewModel to View.
    view.DataContext = viewModel;
    viewModel.OnLoaded();

    // Get navigation values.
    var values = viewModelMapping.Metadata.GetArgumentValues(targetUri);
    viewModel.OnNavigated(values);

    if (view is Page)
    {
      Page page = view as Page;
      page.Title = viewModel.GetTitle();
    }
    else if (view is ChildWindow)
    {
      ChildWindow window = view as ChildWindow;
      window.Title = viewModel.GetTitle();
    }

    // Do not navigate if it's a ChildWindow.
    if (view is ChildWindow)
    {
      ProcessChildWindow(view as ChildWindow, viewModel);
      return null;
    }
    else
    {
      // Navigate because it's a Control.
      var result = new CompositionNavigationAsyncResult(asyncState, view);
      userCallback(result);
      return result;
    }
  }

  private void ProcessChildWindow(ChildWindow window, IViewModel viewModel)
  {
    // Close the ChildWindow if the ViewModel requests it.
    var closableViewModel = viewModel as IClosableViewModel;
    if (closableViewModel != null)
    {
      closableViewModel.CloseView += (s, e) => { window.Close(); };
    }

    // Show the window.
    window.Show();
  }

  public LoadResult EndLoad(IAsyncResult asyncResult)
  {
    return new LoadResult((asyncResult as CompositionNavigationAsyncResult).Result);
  }
}

Figure 3 Custom INavigationContentLoader

In addition to the ViewModel interface, it allows you to define a navigation path such as Customer/{Id}. It will then process this path using Customer as the Key and {Id} as one of the arguments. Here's an example of how this attribute is used:

[ViewModelExport(typeof(ICustomerDetailViewModel), "Customer/{id}")]
public class CustomerDetailViewModel : ICustomerDetailViewModel

Before continuing, there are a few important things to note. First, your attribute should be decorated with the [MetadataAttribute] attribute to work correctly.
Second, your attribute should implement an interface with the values you'll want to expose as metadata. And finally, mind the constructor of the attribute—it passes a type to the base constructor. The class that's decorated with this attribute will be exposed using this type; in the case of my example, this would be IViewModel.

That's it for exporting the ViewModels. If you want to import them somewhere, you'd write something like this:

[ImportMany(typeof(IViewModel))]
public List<Lazy<IViewModel, IViewModelMetadata>> ViewModels { get; set; }

This will give you a list that contains all exported ViewModels with their respective metadata, allowing you to enumerate the list and maybe pick out only the ones of interest to you (based on the metadata). In fact, the Lazy object will make sure that only the ones of interest are actually instantiated. The View will need something similar:

[MetadataAttribute]
[AttributeUsage(AttributeTargets.Class, AllowMultiple = false)]
public class ViewExportAttribute : ExportAttribute, IViewMetadata
{
  public Type ViewModelContract { get; set; }

  public ViewExportAttribute() : base(typeof(IView)) { }
}

There's nothing special in this example, either. This attribute allows you to set the contract of the ViewModel to which the View should be linked. Here's an example of AboutView:

[ViewExport(ViewModelContract = typeof(IAboutViewModel))]
public partial class AboutView : Page, IView
{
  public AboutView()
  {
    InitializeComponent();
  }
}

A Custom INavigationContentLoader
Now that the overall architecture has been set up, let's take a look at controlling what's loaded when a user navigates. To create a custom content loader, the following interface needs to be implemented:

public interface INavigationContentLoader
{
  IAsyncResult BeginLoad(Uri targetUri, Uri currentUri, AsyncCallback userCallback, object asyncState);
  void CancelLoad(IAsyncResult asyncResult);
  bool CanLoad(Uri targetUri, Uri currentUri);
  LoadResult EndLoad(IAsyncResult asyncResult);
}

The most important part of the interface is the BeginLoad method, because this method should return an IAsyncResult containing the item that will be displayed in the Frame. Figure 3 shows the implementation of the custom INavigationContentLoader.

As you can see, a lot happens in this class—but it's actually simple. The first thing to notice is the Export attribute. This is required to be able to import this class in the CompositionNavigationBehavior.

The most important parts of this class are the ViewExports and ViewModelExports properties. These enumerations contain all exports for the Views and the ViewModels, including their metadata. Instead of using a Lazy object I'm using an ExportFactory. This is a huge difference! Both classes will only instantiate the object when required, but the difference is that with the Lazy class you can only create a single instance of the object. The ExportFactory (named after the Factory pattern) is a class that allows you to request a new instance of the type of object whenever you feel like it.

Finally, there's the BeginLoad method. This is where the magic happens. This is the method that provides the Frame with the content to display after navigating to a given URI.

Creating and Processing Objects
Let's say you tell the frame to navigate to Customers. This will be what you'll find in the targetUri argument of the BeginLoad method. Once you have this you can get to work. The first thing to do is find the correct ViewModel.
The ViewModelExports property is an enumeration that contains all the exports with their metadata. Using a lambda expression you can find the correct ViewModel based on its key. Remember the following:

[ViewModelExport(typeof(ICustomersViewModel), "Customers")]
public class CustomersViewModel : ContosoViewModelBase, ICustomersViewModel

Well, imagine you navigate to Customers. Then the following code will find the right ViewModel:

var viewModelMapping = ViewModelExports.FirstOrDefault(o =>
  o.Metadata.Key.Equals("Customers", StringComparison.OrdinalIgnoreCase));

Once the ExportFactory is located, the same thing should happen for the View. However, instead of looking for the navigation key, you look for the ViewModelContract as defined in both the ViewModelExportAttribute and the ViewExportAttribute:

[ViewExport(ViewModelContract = typeof(IAboutViewModel))]
public partial class AboutView : Page

Figure 4 Extension Methods for Navigation Arguments
Figure 5 Setting a Custom Window Title

Once both ExportFactories are found, the hard part is over. Now the CreateExport method allows you to create a new instance of the View and the ViewModel:

var viewFactory = viewMapping.CreateExport();
var view = viewFactory.Value as Control;
var viewModelFactory = viewModelMapping.CreateExport();
var viewModel = viewModelFactory.Value as IViewModel;

After both the View and the ViewModel have been created, the ViewModel is stored in the DataContext of the View, starting the required data bindings. And the OnLoaded method of the ViewModel is called to notify the ViewModel that all the heavy lifting has been done, and also that all Imports—if there are any—have been imported.

You shouldn't underestimate this last step when you're using the Import and ImportMany attributes. In many cases you'll want to do something when creating a ViewModel, but only when everything has been loaded correctly. If you're using an ImportingConstructor you definitely know when all Imports were imported (that would be when the constructor is called). But when working with the Import/ImportMany attributes, you would have to start writing code in all your properties to set flags in order to know when all properties have been imported. In this case the OnLoaded method solves this issue for you.

Passing Arguments to the ViewModel
Take a look at the IViewModel interface, and pay attention to the OnNavigated method:

public interface IViewModel
{
  void OnLoaded();
  void OnNavigated(NavigationArguments args);
  string GetTitle();
}

When you navigate to Customers/1, for example, this path is parsed and the arguments are combined in the NavigationArguments class (this is just a Dictionary with extra methods like GetInt, GetString and so on). Because it's mandatory that each ViewModel implements the IViewModel interface, it's possible to call the OnNavigated method after resolving the ViewModel:

// Get navigation values.
var values = viewModelMapping.Metadata.GetArgumentValues(targetUri);
viewModel.OnNavigated(values);
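As a rough idea of what GetArgumentValues might do—this is my own sketch based on the path-template convention described above, not the Figure 4 listing—the arguments can be recovered by matching URI segments against the NavigationPath template (assuming NavigationArguments derives from a case-insensitive Dictionary<string, string>):

public static class NavigationArgumentsExtensions
{
  // Match "Customer/1" against "Customer/{id}" and collect {id} = "1".
  public static NavigationArguments GetArgumentValues(
    this IViewModelMetadata metadata, Uri targetUri)
  {
    var args = new NavigationArguments();
    var template = metadata.NavigationPath.Split(
      new[] { '/' }, StringSplitOptions.RemoveEmptyEntries);
    var actual = targetUri.OriginalString.Split(
      new[] { '/' }, StringSplitOptions.RemoveEmptyEntries);

    for (int i = 0; i < template.Length && i < actual.Length; i++)
    {
      if (template[i].StartsWith("{") && template[i].EndsWith("}"))
        args.Add(template[i].Trim('{', '}'), actual[i]);
    }
    return args;
  }
}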
When the CustomersViewModel wants to open a CustomerDetailViewModel, the following happens:

NavigationService.Navigate("Customer/{0}", SelectedCustomer.Id);

These arguments then arrive in the CustomerDetailViewModel and can be used to pass to the DataService, for example:

public override void OnNavigated(NavigationArguments args)
{
  var id = args.GetInt("Id");
  if (id.HasValue)
  {
    Customer = DataService.GetCustomerById(id.Value);
  }
}

To find the arguments, I wrote a class containing two extension methods that do some work based on the information in the ViewModel metadata (see Figure 4). This proves again that the metadata concept in MEF is really useful.

The Final Chores
If the View is a Page or a ChildWindow, the title of this control will also be extracted from the IViewModel object. This allows you to dynamically set the titles of your Pages and ChildWindows based on the current customer, as shown in Figure 5.

After all these great little things, there's one last step. If the View is a ChildWindow, the window should be displayed. But if the ViewModel implements IClosableViewModel, the CloseView event of this ViewModel should be linked to the Close method on the ChildWindow. The IClosableViewModel interface is simple:

public interface IClosableViewModel : IViewModel
{
  event EventHandler CloseView;
}

Processing the ChildWindow is also trivial. When the ViewModel raises the CloseView event, the Close method of the ChildWindow gets called. This allows you to indirectly connect the ViewModel to the View:

// Close the ChildWindow if the ViewModel requests it.
var closableViewModel = viewModel as IClosableViewModel;
if (closableViewModel != null)
{
  closableViewModel.CloseView += (s, e) => { window.Close(); };
}

// Show the window.
window.Show();

If the View isn't a ChildWindow, then it should simply be made available in the IAsyncResult. This will show the View in the Frame. There—now you've seen the whole process of how the View and ViewModel are constructed.

Using the Example Code
The code download for this article contains an MVVM application using this type of custom navigation with MEF. The solution contains the following examples:

• Navigating to a regular UserControl
• Navigating to a regular UserControl by passing arguments (.../#Employee/DiMattia)
• Navigating to a ChildWindow by passing arguments (.../#Customer/1)
• Imports of the INavigationService, the IDataService, ...
• Examples of ViewExport and ViewModelExport configuration

This article should have provided a good idea of how the sample works. For a deeper understanding, play with the code and customize it for your own applications. You'll see how powerful and flexible MEF can be.

SANDRINO DI MATTIA is a software engineer at RealDolmen and has a passion for everything that is Microsoft. He also participates in user groups and writes articles on his blog at blog.sandrinodimattia.net.

THANKS to the following technical experts for reviewing this article: Glenn Block and Daniel Plaisted

PARALLEL COMPUTING
Data Processing: Parallelism and Performance

Processing data collections is a fundamental computing task, and a number of practical problems are inherently parallel, potentially enabling improved performance and throughput on multicore systems.
I’ll compare several distinct Windows-based methods to solve problems with a high degree of data parallelism. Te benchmark I’ll use for this comparison is a search problem (“Geo- names”) from Chapter 9 of Troy Magennis’ book, “LINQ to Objects Using C# 4.0” (Addison-Wesley, 2010). Te alternative solutions are: • Parallel Language Integrated Query (PLINQ) and C# 4.0, with and without enhancements to the original code. • Windows native code using C, the Windows API, threads and memory-mapped fles. • Windows C#/Microsof .NET Framework multithreaded code. The source code for all solutions is available on my Web site (jmhartsoftware.com). Other parallelism techniques, such as the Windows Task Parallel Library (TPL), are not examined directly, although PLINQ is layered on the TPL. Johnson M. Hart Comparing and Evaluating Alternative Solutions Te solution evaluation criteria, in order of importance, are: • Total performance in terms of elapsed time to complete the task. • Scalability with the degree of parallelism (number of tasks), core count and data collection size. • Code simplicity, elegance, ease of maintenance and similar intangible factors. Results Summary Tis article will show that for a representative benchmark search problem: • You can successfully exploit multicore 64-bit systems to improve performance in many data-processing problems, and PLINQ can be part of the solution. • Competitive, scalable PLINQ performance requires indexed data collection objects; supporting the IEnumerable inter- face isn’t suf cient. • Te C#/.NET and native code solutions are fastest. • Te original PLINQ solution is slower by a factor of nearly 10, and it doesn’t scale beyond two tasks, whereas the other solutions scale well up to six tasks with six cores (the maxi- mum tested). However, code enhancements improve the original solution signifcantly. • The PLINQ code is the simplest and most elegant in all respects because LINQ provides a declarative query capabil- ity for memory-resident and external data. Te native code is ungainly, and the C#/.NET code is considerably better than—but not as simple as—the PLINQ code. • All methods scale well with fle size up to the limits of the test system physical memory. This article discusses: • Evaluation criteria for alternative parallelism strategies • Optimizing a benchmark data search problem • A performance comparison of different approaches Technologies discussed: PLINQ, C#, Microsoft .NET Framework Code download available at: jmhartsoftware.com 49 January 2011 msdnmagazine.com The Benchmark Problem: Geonames Te idea for this article came from Chapter 9 of Magennis’ LINQ book, which demonstrates PLINQ by searching a geographic data- base containing more than 7.25 million place names in an 825MB fle (that’s more than one location for every 1,000 people). Each place name is represented by a UTF-8 (en.wikipedia.org/wiki/UTF-8) text line record (variable length) with more than 15 tab-separated data columns. Note: Te UTF-8 encoding assures that a tab (0x9) or line feed (0xA) value won’t occur as part of a multi-byte sequence; this is essential for several implementations. Magennis’ Geonames program implements a hard-coded query to identify all locations with an elevation (column 15) greater than 8,000 meters, displaying the location name, country and elevation, sorted by decreasing elevation. In case you’re wondering, there are 16 such locations, and Mt. Everest is the highest at 8,848 meters. 
Magennis reports elapsed times of 22.3 seconds (one core) and 14.1 seconds (two cores). Previous experience (for example, see my article "Windows Parallelism, Fast File Searching and Speculative Processing" at informit.com/articles/article.aspx?p=1606242) shows that files of this size can be processed in a few seconds and that performance scales well with the core count. Therefore, I decided to attempt to replicate that experience and also try to enhance Magennis' PLINQ code for better performance. The initial PLINQ enhancements almost doubled the performance but didn't improve scalability; further enhancements, however, do yield performance nearly as good as native and C# multithreaded code.

This benchmark is interesting for several reasons:

• The subject (geographical places and attributes) is inherently interesting, and it's easy to generalize the query.
• There's a high degree of data parallelism; in principle, every record could be processed concurrently.
• The file size is modest by today's standards, but it's simple to test with larger files simply by concatenating the Geonames allCountries.txt file to itself multiple times.
• The processing is not stateless; it's necessary to determine the line and field boundaries in order to partition the file, and the lines need to be processed to identify the individual fields.

An Assumption: Assume that the number of identified records (in this instance, locations higher than 8,000 meters) is small, so that sorting and display time are minimal relative to the overall processing time, which involves examining every byte.

Another Assumption: The performance results represent the time required to process memory-resident data collections, such as data produced by a prior program step. The benchmark program does read a file, but the test programs are run several times to assure that the file is memory-resident. However, I'll mention the time required to load the file initially, and this time is approximately the same for all solutions.

Performance Comparison
The first test system is a six-core desktop system running Windows 7 (AMD Phenom II, 2.80 GHz, 4GB RAM). Later, I'll present results for three other systems with hyper-threading, or HT (en.wikipedia.org/wiki/Hyper-threading), and different core counts.

Figure 1 shows the results for six different Geonames solutions, with elapsed time (seconds) as a function of "Degree of Parallelism," or DoP (the number of parallel tasks, which can be set higher than the number of processors); the test system has six cores, but the implementations control the DoP. Six tasks are optimal; using more than six tasks degraded performance. All tests use the original Geonames 825MB allCountries.txt data file.

Figure 1 Geonames Performance as a Function of Degree of Parallelism (elapsed time in seconds vs. DoP 1-8 for the Original, Helper, MMChar, MMByte, Index and Multithreaded implementations)

Figure 2 Geonames Performance as a Function of File Size (elapsed time in seconds at DoP 4 for the Index, Native and .NET Threads implementations at 825MB, 1,651MB and 3,302MB)

The implementations are (fuller explanations will follow):

1. Geonames Original. This is Magennis' original PLINQ solution. The performance isn't competitive and doesn't scale with processor count.
2. Geonames Helper. This is a performance-enhanced version of Geonames Original.
3. Geonames MMChar. This was an unsuccessful attempt to enhance Geonames Helper with a memory-mapped file class similar to that used in Geonames Threads. Note: memory mapping allows a file to be referenced as if it were in memory without explicit I/O operations, and it can provide performance benefits.
4. Geonames MMByte. This solution modifies MMChar to process the individual bytes of the input file, whereas the previous three solutions convert UTF-8 characters to Unicode (at 2 bytes each). The performance is the best of the first four solutions and is more than double that of Geonames Original.
5. Geonames Threads doesn't use PLINQ. This is the C#/.NET implementation using threads and a memory-mapped file. The performance is faster than Index (the next item) and about the same as Native. This solution and Geonames Native provide the best parallelism scalability.
6. Geonames Index. This PLINQ solution preprocesses the data file (requiring about nine seconds) to create a memory-resident List<byte[]> object for subsequent PLINQ processing. The preprocessing cost can be amortized over multiple queries, giving performance that's only slightly slower than Geonames Native and Geonames Threads.
7. Geonames Native (not shown in Figure 1) doesn't use PLINQ. This is the C Windows API implementation using threads and a memory-mapped file, as in Chapter 10 of my book, "Windows System Programming" (Addison-Wesley, 2010). Full compiler optimization is essential for these results; the default optimization provides only about half the performance.

All implementations are 64-bit builds. 32-bit builds work in most cases, but they fail for larger files (see Figure 2).

Figure 2 shows performance using DoP 4 and larger files. The test system in this case has four cores (AMD Phenom Quad-Core, 2.40 GHz, 8GB RAM). The larger files were created by concatenating multiple copies of the original file. Figure 2 shows just the three fastest solutions, including Geonames Index—the fastest PLINQ solution (not counting file preprocessing)—and performance scales with the file size up to the limits of physical memory.

I'll now describe implementations two through seven and discuss the PLINQ techniques in more detail. Following that, I'll discuss results on other test systems and summarize the findings.

The Enhanced PLINQ Solutions: Geonames Helper
Figure 3 shows Geonames Helper with my changes (in bold face) to the Geonames Original code. As many readers may not be familiar with PLINQ and C# 4.0, I'll provide a few comments about Figure 3, including descriptions of the enhancements:

• Lines 9-14 allow the user to specify the input file name and the degree of parallelism (the maximum number of concurrent tasks) on the command line; these values are hardcoded in the original.
• Lines 16-17 start to read the file lines asynchronously and implicitly type lines as a C# String array. The lines values aren't used until Lines 19-27. Other solutions, such as Geonames MMByte, use a different class with its own ReadLines method, and these code lines are the only lines that need to be changed.
• Lines 19-27 are LINQ code along with the PLINQ AsParallel extension. The code is similar to SQL, and the variable "q" is implicitly typed as an array of objects consisting of an integer elevation and a String. Notice that PLINQ performs all the thread-management work; the AsParallel method is all that's needed to turn serial LINQ code into PLINQ code.
• Line 20. Figure 4 shows the Helper.ExtractIntegerField method. The original program uses the String.Split method in a manner similar to that used to display the results in Line 33 (Figure 3). This is the key to the improved performance of Geonames Helper compared to Geonames Original, as it's no longer necessary to allocate String objects for every field in every line.

Note that the AsParallel method used in Line 19 can be used with any IEnumerable object.

As I mentioned earlier, Figure 4 shows the Helper class ExtractIntegerField method. It simply extracts and evaluates the specified field (elevation in this case), avoiding library methods for performance. Figure 1 shows that this enhancement doubles the performance with DoP 1.

1 class Program
2 {
3   static void Main(string[] args)
4   {
5     const int nameColumn = 1;
6     const int countryColumn = 8;
7     const int elevationColumn = 15;
8
9     String inFile = "Data/AllCountries.txt";
10    if (args.Length >= 1) inFile = args[0];
11
12    int degreeOfParallelism = 1;
13    if (args.Length >= 2) degreeOfParallelism = int.Parse(args[1]);
14    Console.WriteLine("Geographical data file: {0}. Degree of Parallelism: {1}.", inFile, degreeOfParallelism);
15
16    var lines = File.ReadLines(Path.Combine(
17      Environment.CurrentDirectory, inFile));
18
19    var q = from line in lines.AsParallel().WithDegreeOfParallelism(degreeOfParallelism)
20      let elevation = Helper.ExtractIntegerField(line, elevationColumn)
21      where elevation > 8000 // elevation in meters
22      orderby elevation descending
23      select new
24      {
25        elevation = elevation,
26        thisLine = line
27      };
28
29    foreach (var x in q)
30    {
31      if (x != null)
32      {
33        String[] fields = x.thisLine.Split(new char[] { '\t' });
34        Console.WriteLine("{0} ({1}m) - located in {2}",
35          fields[nameColumn], fields[elevationColumn], fields[countryColumn]);
36      }
37    }
38  }
39 }

Figure 3 Geonames Helper with Highlighted Changes to the Original PLINQ Code

41 class Helper
42 {
43   public static int ExtractIntegerField(String line, int fieldNumber)
44   {
45     int value = 0, iField = 0;
46     byte digit;
47
48     // Skip to the specified field number and extract the decimal value.
49     foreach (char ch in line)
50     {
51       if (ch == '\t') { iField++; if (iField > fieldNumber) break; }
52       else
53       {
54         if (iField == fieldNumber)
55         {
56           digit = (byte)(ch - 0x30); // 0x30 is the character '0'
57           if (digit >= 0 && digit <= 9) { value = 10 * value + digit; }
58           else // Character not in [0-9]. Reset the value and quit.
59           { value = 0; break; }
60         }
61       }
62     }
63     return value;
64   }
65 }

Figure 4 Geonames Helper Class and ExtractIntegerField Method

Geonames MMChar and Geonames MMByte
Geonames MMChar represents an unsuccessful attempt to improve performance by memory mapping the input file, using a custom class, FileMmChar. Geonames MMByte, however, does yield significant benefits, as the input file bytes aren't expanded to Unicode.

MMChar requires a new class, FileMmChar, which supports the IEnumerable<String> interface. The FileMmByte class is similar and deals with byte[] objects rather than String objects. The only significant code change is to Figure 3, Lines 16-17, which are now:

var lines = FileMmByte.ReadLines(Path.Combine(
  Environment.CurrentDirectory, inFile));

The code for public static IEnumerable<byte[]> ReadLines(String path) that supports the IEnumerable<byte[]> interface in FileMmByte constructs a FileMmByte object and an IEnumerator<byte[]> object that scans the mapped file for individual lines.

Note that the FileMmChar and FileMmByte classes are "unsafe" because they create and use pointers to access the files, and they use C#/native code interoperability.
All the pointer usage is, however, isolated in a separate assembly, and the code uses arrays rather than pointer dereferencing. The .NET Framework 4 MemoryMappedFile class doesn't help, because it's necessary to use accessor functions to move data from mapped memory.

Geonames Native
Geonames Native exploits the Windows API, threads and file memory mapping. The basic code patterns are described in Chapter 10 of "Windows System Programming." The program must manage the threads directly and must also carefully map the file to memory. The performance is much better than all PLINQ implementations except Geonames Index.

There is, however, an important distinction between the Geonames problem and a simple, stateless file search or transformation. The challenge is to determine the correct method to partition the input data so as to assign different partitions to different tasks. There's no obvious way to determine the line boundaries without scanning the entire file, so it isn't feasible to assign a fixed-size partition to each task. However, the solution is straightforward when illustrated with DoP 4:

• Divide the input file into four equal partitions, with the partition start location communicated to each thread as part of the thread function argument.
• Have each thread then process all lines (records) that start in the partition. This means that a thread will probably scan into the next partition in order to complete processing of the last line that starts in the partition (see the sketch after the next section).

Geonames Threads
Geonames Threads uses the same logic as Geonames Native; in fact, some code is the same or nearly the same. However, lambda expressions, extension methods, containers and other C#/.NET features greatly simplify the coding. As with MMByte and MMChar, the file memory mapping requires "unsafe" classes and C#/native code interoperability in order to use pointers to the mapped memory. The effort is worthwhile, however, because Geonames Threads performance is the same as Geonames Native performance with much simpler code.
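The partitioning rule used by Geonames Native and Geonames Threads is easy to express in C#. The following is only a sketch of the idea—my own illustration rather than the download's code—and it uses Parallel.For for brevity where the actual implementations manage threads explicitly; it also assumes the mapped file is visible as a byte[]:

// Each task owns the lines that *start* in its partition and may read past
// the partition end to finish the last of those lines.
static void ProcessAllLines(byte[] data, int dop, Action<byte[], int, int> processLine)
{
  System.Threading.Tasks.Parallel.For(0, dop, p =>
  {
    int size = data.Length;
    int start = (int)((long)size * p / dop);
    int end = (int)((long)size * (p + 1) / dop);

    // Skip forward to the first line that starts inside this partition.
    while (start > 0 && start < size && data[start - 1] != (byte)'\n')
      start++;

    while (start < end)
    {
      int next = start;
      while (next < size && data[next] != (byte)'\n')
        next++;                               // end of the current line (may pass 'end')
      processLine(data, start, next - start); // offset and length of one line
      start = next + 1;
    }
  });
}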
Geonames Index
The PLINQ results (Original, Helper, MMChar and MMByte) are disappointing compared to the Native and .NET Threads results. Is there a way to exploit the simplicity and elegance of PLINQ without sacrificing performance?

Although it's impossible to determine exactly how PLINQ processes the query (Lines 16-27 in Figure 3), it's probable that PLINQ has no good way to partition the input lines for parallel processing by separate tasks. As a working hypothesis, assume that partitioning may be the cause of the PLINQ performance problem. From Magennis' book (pp. 276-279), the lines String array supports the IEnumerable<String> interface (see also John Sharp's book, "Microsoft Visual C# 2010 Step by Step" [Microsoft Press, 2010], Chapter 19). However, lines isn't indexed, so PLINQ probably uses "chunk partitioning." Furthermore, the IEnumerator.MoveNext methods for the FileMmChar and FileMmByte classes are slow because they need to scan every character until the next new line is located.

What would happen if the lines String array were indexed? Could we improve PLINQ performance, especially when supplemented by memory mapping the input file? Geonames Index shows that this technique does improve performance, yielding results comparable to native code. In general, however, either there's a necessary up-front cost to move the lines into an in-memory list or array, which is indexed (the cost can then be amortized over multiple queries), or the file or other data source is already indexed, perhaps during generation in an earlier program step, eliminating the preprocessing cost.

The up-front indexing operation is simple: just access each line, one at a time, and then add the line to a list. Use the list object in Lines 16-17, as illustrated in Figure 3, and in this code snippet, which shows the preprocessing:

// Preprocess the file to create a list of byte[] lines
List<byte[]> lineListByte = new List<byte[]>();
var lines = FileMmByte.ReadLines(Path.Combine(Environment.CurrentDirectory, inFile));
// ... Multiple queries can use lineListByte
foreach (byte[] line in lines) { lineListByte.Add(line); }
// ....
var q = from line in lineListByte.AsParallel().WithDegreeOfParallelism(degreeOfParallelism)

Note that it's slightly more efficient to process the data further by converting the list to an array, although this increases the preprocessing time.

A Final Performance Enhancement
Geonames Index performance can be improved still further by indexing the fields in each line so that the ExtractIntegerField method doesn't need to scan all the characters in a line out to the specified field.

The implementation, Geonames IndexFields, modifies the ReadLines method so that a returned line is an object containing both a byte[] array and a uint[] array containing the locations of each field. This results in about a 33 percent performance improvement over Geonames Index and brings the performance fairly close to the native and C#/.NET solutions. (Geonames IndexFields is included with the code download.) Furthermore, it's now much easier to construct more general queries, as the individual fields are readily available.

Limitations
The efficient solutions all require memory-resident data, and the performance advantages don't extend to very large data collections. "Very large" in this case refers to data sizes that approach the system physical memory size. In the Geonames example, the 3,302MB file (four copies of the original file) could be processed on the 8GB test system. A test with eight concatenated copies of the file, however, was very slow with all the solutions.

As mentioned earlier, performance is also best when the data files are "live," in the sense that they've been accessed recently and are likely to be in memory. Paging in the data file during the initial run can take 10 or more seconds and is comparable to the indexing operation in the earlier code snippet.
In summary, the results in this article apply to memory-resident data structures, and today's memory sizes and prices allow significant data objects, such as a file of 7.25 million place names, to be memory-resident.

Additional Test System Results
Figure 5 shows test results on an additional system (Intel i7 860, 2.80GHz, four cores, eight threads, Windows 7, 4GB RAM). The processor supports hyper-threading, so the tested DoP values are 1, 2, ..., 8. Figure 1 is based on a test system with six AMD cores; this system doesn't support hyper-threading. Two additional test configurations produced similar results (and the full data is available on my Web site):
• Intel i7 720, 1.60GHz, four cores, eight threads, Windows 7, 8GB RAM
• Intel i3 530, 2.93GHz, two cores, four threads, Windows XP64, 4GB RAM

Figure 5 Intel i7 860, 2.80GHz, Four Cores, Eight Threads, Windows 7, 4GB RAM (elapsed time in seconds versus Degree of Parallelism, DoP 1-8, for the Original, Helper, MMChar, MMByte, Index and .NET Threads implementations)

Interesting performance characteristics include:
• Geonames Threads consistently provides the best performance, along with Geonames Native.
• Geonames Index is the fastest PLINQ solution, approaching the Geonames Threads performance. Note: Geonames IndexFields is slightly faster but isn't shown in Figure 5.
• Other than Geonames Index, all PLINQ solutions scale negatively with the DoP for DoP greater than two; that is, performance decreases as the number of parallel tasks increases. From this example, PLINQ produces good performance only when used with indexed objects.
• The hyper-threading performance contribution is marginal. Therefore, Geonames Threads and Geonames Index performance doesn't increase significantly for DoP greater than four. This poor HT scalability might be a consequence of scheduling two threads on logical processors of the same core, rather than assuring that they run on distinct cores when possible. However, this explanation doesn't appear to be plausible, as Mark E. Russinovich, David A. Solomon and Alex Ionescu say that physical processors are scheduled before logical processors on p. 40 of their book, "Windows Internals, Fifth Edition" (Microsoft Press, 2009). The AMD systems without HT (Figure 1) provided about three to four times the performance with DoP greater than four, compared to the sequential DoP 1 results, for Threads, Native and Index. Figure 1 shows that the best performance occurs when the DoP is the same as the core count, where the multithreaded performance is 4.2 times the DoP 1 performance.

Results Summary
PLINQ provides an excellent model for processing in-memory data structures, and it's possible to improve existing code performance with some simple changes (for example, Helper) or with more advanced techniques, as was shown with MMByte. However, none of the simple enhancements provide performance close to that of native or multithreaded C#/.NET code. Furthermore, the enhancements don't scale with the core count and DoP. PLINQ can come close to native and C#/.NET code performance, but it requires the use of indexed data objects.

Using the Code and Data
All code is available on my Web site (jmhartsoftware.com/SequentialFileProcessingSupport.html), following these instructions:
• Go to the page to download a ZIP file containing the PLINQ code and the Geonames Threads code. All PLINQ variations are in the GeonamesPLINQ project (Visual Studio 2010; Visual Studio 2010 Express is sufficient). Geonames Threads is in the GeonamesThreads Visual Studio 2010 project. The projects are both configured for 64-bit release builds. The ZIP file also contains a spreadsheet with the raw performance data used in Figures 1, 2 and 5. A simple "Usage" comment at the head of the file explains the command-line option to select the input file, DoP and implementation.
• Go to the Windows System Programming support page (jmhartsoftware.com/comments_updates.html) to download the Geonames Native code and projects (a ZIP file), where you'll find the Geonames project. A ReadMe.txt file explains the structure.
• Download the GeoNames database from download.geonames.org/export/dump/allCountries.zip.

Unexplored Questions
This article compared the performance of several alternative techniques to solve the same problem. The approach has been to use standard interfaces as they're described and to assume a simple shared-memory model for the processors and threads. There was, however, no significant effort to probe deeply into the underlying implementations or specific features of the test machines, and numerous questions could be investigated in the future. Here are some examples:
• What's the effect of cache misses, and is there any method to reduce the impact?
• What would be the impact of solid-state disks?
• Is there any way to reduce the performance gap between the PLINQ Index solution and the Threads and Native solutions? Experiments reducing the amount of data copying in the FileMmByte IEnumerator.MoveNext and Current methods didn't show any significant benefit.
• Is the performance close to the theoretical maximum as determined by memory bandwidth, CPU speed and other architectural features?
• Is there a way to get scalable performance on HT systems (see Figure 5) that's comparable to systems without HT (Figure 1)?
• Can you use profiling and Visual Studio 2010 tools to identify and remove performance bottlenecks?
I hope you're able to investigate further.

JOHNSON (JOHN) M. HART is a consultant specializing in Microsoft Windows and Microsoft .NET Framework application architecture and development, technical training and writing. He has many years of experience as a software engineer, engineering manager and architect at Cilk Arts Inc. (since acquired by Intel Corp.), Sierra Atlantic Inc., Hewlett-Packard Co. and Apollo Computer. He served as a computer science professor for many years and has authored four editions of "Windows System Programming" (Addison-Wesley, 2010).

THANKS to the following technical experts for reviewing this article: Michael Bruestle, Andrew Greenwald, Troy Magennis and CK Park

VISUAL STUDIO
Use Multiple Visual Studio Project Types for Cloud Success
Patrick Foley

As you've probably noticed, there are many different project types in Visual Studio these days. Which one do you choose? All have strengths that help solve problems in different situations. Even within a single business problem, there are often multiple use cases that can best be solved by different Visual Studio project types. I was confronted with a real-world example of such a problem recently while building out the infrastructure for a cloud-based program I lead that's designed to highlight success stories: Microsoft Solutions Advocates (microsoftsolutionsadvocates.com).
I used several different Visual Studio project types to build my solution, and in this article, I'll walk through a simplified example of my project called "Customer Success Stories," or CSS. CSS has three distinct use cases:
1. Anonymous users read success stories on a public Web site.
2. Users who belong to the program log in to a private Web site to create and edit their own success stories.
3. Administrators (like me) log in to an administrative Web site to manage and edit all the data, including minutia such as lookup tables.
The approach I settled on combined three Microsoft .NET Framework technologies:
1. ASP.NET MVC for the public site
2. WCF RIA Services for the private, customer-edit site
3. ASP.NET Dynamic Data for the administrative site

This article discusses:
• Creating an Entity Framework data model
• Creating an ASP.NET Dynamic Data project
• Creating a Windows Azure service project
• Creating an ASP.NET MVC project
• Creating a WCF RIA Services project
• Adding the final touches for a complete solution
Technologies discussed: ASP.NET MVC, WCF RIA Services, ASP.NET Dynamic Data
Code download available at: code.msdn.microsoft.com/mag201101VSCloud

Any of these technologies could've been used to create the whole solution, but I preferred to take advantage of the best features of each.

ASP.NET MVC is an ideal technology for creating a public Web site that will work everywhere, because it emits standard HTML, for one thing. The public site has a marketing purpose, so I'll eventually engage a designer to polish the appearance. Working with a designer adds complexity, but ASP.NET MVC has a straightforward view implementation that makes it easy to incorporate a designer's vision. Making the site read-only and separating it from the other use cases helps isolate the scope of the designer's involvement.

Although ASP.NET MVC could also be used to implement the customer-editing functionality, WCF RIA Services is an even better fit. (Conversely, WCF RIA Services could be used to build a great-looking public Web site, but Silverlight isn't supported on some devices, such as iPhones and iPads, and I wanted the greatest reach for the public use case.) Silverlight is here to stay, and it's perfect for creating a rich editing experience with very little programming, so long as it's reasonable to expect users to have it or install it, as would be the case with customers collaborating on a success-stories site.

ASP.NET Dynamic Data provides a handy way to build an administrative solution without too much work. The administrative site doesn't need to be fancy; it simply needs to provide a way to manage all of the data in the solution without having to resort to SQL Server Management Studio. As my solution evolves, the ASP.NET Dynamic Data site could conceivably be subsumed by the WCF RIA Services site. Nevertheless, it's useful at the beginning of a data-centric development project such as this one, and it costs almost nothing to build.

Targeting Windows Azure
Again, this example is based on a real-world problem, and because the solution requires a public Web site, I'm going to target Windows Azure and SQL Azure. Windows Server and SQL Server might be more familiar, but I need the operational benefits of running in the cloud (no need to maintain the OS, apply patches and so on). I barely have time to build the solution—I certainly don't have time to operate it, so Windows Azure is a must for me.
To work through this example and try it on Windows Azure, you need an account. There are various options and packages found at microsoft.com/windowsazure/offers. MSDN subscribers and Microsoft partners (including BizSpark startups—visit bizspark.com to learn more) have access to several months of free resources. For prototyping and learning (such as working through this example), you can use the "Introductory Special." It includes three months of a 1GB SQL Azure database and 25 hours of a Windows Azure small compute instance, which should be enough to familiarize yourself with the platform. I built this example using the Introductory Special without incurring any additional charges.

'Customer Success Stories' Overview
This example is presented in the form of a tutorial. Several important aspects of a real-world implementation are out of scope for this article, including user-level security, testing, working with a designer and evolving beyond an extremely simplified model. I'll attempt to address these in future articles or on my blog at pfoley.com. The steps are presented at a high level and assume some familiarity with Visual Studio and with the technologies involved. Code for the entire solution can be downloaded from code.msdn.microsoft.com/mag201101VSCloud, and click-by-click instructions for each step can be found at pfoley.com/mm2011jan.

Step 1: Create a Project for the Entity Framework Data Model
The Web technologies used in this example all use the Entity Framework effectively, so I chose to integrate the three use cases by having them all share a common Entity Framework model. When working with an existing database, you have to generate the model from the database. Whenever possible, I prefer to create the model first and generate the database from it, because I like to think about my design more at the model level than the database level. To keep things simple, this example uses just two entities: Company and Story.

Figure 1 Adding Company and Story Entities to the Visual Studio Project
Figure 2 Configuring Settings in the SQL Azure Portal

The Entity Framework model will be created in its own project and shared across multiple projects (I learned how to do this from Julie Lerman; see pfoley.com/mm2011jan01 for more on that). I call best practices like these "secret handshakes"—figuring it out the first time is a challenge, but once you know the secret, it's simple:
1. Create the Entity Framework model in a "class library" project.
2. Copy the connection string into any projects that share the model.
3. Add a reference to the Entity Framework project and System.Data.Entity in any projects that share the model.
To start, create a Blank Solution in Visual Studio 2010 named "Customer Success Stories" and add a Class Library project named "CSSModel." Delete the class file and add an empty ADO.NET Entity Data Model item named "CSSModel." Add Company and Story entities with an association between them as in Figure 1 (when you right-click on Company to add the association, make sure that "Add foreign key properties to 'Person' Entity" is checked on the ensuing dialog—foreign key properties are required in future steps). The model is now ready to generate database tables, but a SQL Azure database is needed to put them in. When prototyping is completed and the project is evolving, it's useful to add a local SQL Server database for testing purposes, but at this point, it's less complicated to work directly with a SQL Azure database.
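For orientation, here is a rough conceptual sketch of the shape the two entities take once the model is in place. Only Id, Name and the CompanyId foreign key are relied on later in the article; the Title property and the exact types are illustrative assumptions, and the designer-generated classes differ in detail (they carry navigation properties and change-tracking plumbing).

// Conceptual sketch only; the EDMX designer generates richer classes.
public partial class Company
{
    public int Id { get; set; }
    public string Name { get; set; }    // displayed on the public site later
}

public partial class Story
{
    public int Id { get; set; }
    public string Title { get; set; }   // assumption, not named in the article
    public int CompanyId { get; set; }  // foreign key created by the association
}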
From your SQL Azure account portal, create a new database called "CSSDB" and add rules for your current IP address and to "Allow Microsoft Services access to this server" on the Firewall Settings tab. Your SQL Azure account portal should look something like Figure 2.

In Visual Studio, right-click on the design surface and select "Generate Database from Model." Add a connection to your new SQL Azure database and complete the wizard, which generates some Data Definition Language (DDL) and opens it in a .sql file, as shown in Figure 3. Before you can execute the SQL, you must connect to the SQL Azure database (click the Connect button on the Transact-SQL Editor toolbar). The "USE" statement is not supported in SQL Azure, so you must choose your new database from the Database dropdown on the toolbar and then execute the SQL.

Figure 3 Generating the Database Model

Now you have a SQL Azure database that you can explore in Visual Studio Server Explorer, SQL Server Management Studio or the new management tool, Microsoft Project Code-Named "Houston" (sqlazurelabs.com/houston.aspx). Once you build the solution, you have an Entity Framework project that you can use to access that database programmatically, as well.

Step 2: Create the ASP.NET Dynamic Data Project
An ASP.NET Dynamic Data Web site provides a simple way to work with all the data in the database and establishes baseline functionality to ensure the environment is working properly—all with one line of code.

Add a new ASP.NET Dynamic Data Entities Web Application project to the solution and call it "CSSAdmin." To use the data model from the first step, copy the connectionStrings element from App.Config in CSSModel to web.config in CSSAdmin. Set CSSAdmin as the startup project and add references to the CSSModel project and System.Data.Entity.
Untitled-1 1 1/11/10 11:10 AM msdn magazine 62 Visual Studio implement the default behavior that comes by simply uncommenting the RegisterContext line in Global.asax.cs and changing it to: DefaultModel.RegisterContext(typeof(CSSModel.CSSModelContainer), new ContextConfiguration() { ScaffoldAllTables = true }); Build and run the project, and you have a basic site to manage your data. Add some test data to make sure everything’s working. Step 3: Create the Windows Azure Services Project Te result of the previous step is a local Web site that accesses a database on SQL Azure—the next step is it to get that Web site running on Windows Azure. From your Windows Azure account portal, create a Storage Service called “CSS Storage” and a Hosted Service called “CSS Service.” Your Windows Azure account portal should look similar to Figure 4. In Visual Studio, add a new Windows Azure Cloud Service project to your solution called “CSSAdmin- Service” (you must have Windows Azure Tools for Visual Studio installed), but don’t add additional “cloud service solutions” from the wizard. Te cloud service project adds the infrastructure necessary to run your application in a local version of the “cloud fabric” for development and debugging. It also makes it easy to publish to Windows Azure interactively. Tis is great for prototyping and for simpler Windows Azure solutions, but once you get serious about develop- ing on Windows Azure, you’ll probably want to use Windows PowerShell to script deployment, perhaps as part of a continuous integration solution. Right-click on the Roles folder in CSSAdmin Service, then select “Add | Web Role Project in solution” to associate the CSSAdmin project with the cloud service project. Now when you compile and run the solution, it runs in the development fabric. At this point, the solution doesn’t look any diferent than it did running on IIS or Cassini, but it’s important to run on the dev fabric anyway to catch mistakes such as using unsupported Windows APIs as you evolve a Windows Azure solution. Deploy to your Windows Azure account by right- clicking on the CSSAdminService project and selecting Publish. Te frst time you do this, you’ll need to add credentials (follow the instructions to copy a certif- cate to your Windows Azure account). Ten select a “Hosted Service Slot” and a “Storage Account” to deploy your solution to. Tere are two options for the hosted service slot: production and staging. When updating a real-world solution in production, deploy frst to staging to make sure everything works and then promote the staging environment into production. While prototyp- ing, I prefer to deploy straight to production because I’m not going to leave the solution running anyway. Click OK to deploy to Windows Azure, which can take several minutes. Once complete, run your Windows Azure application using the Web site URL shown on the service page, which should look similar to Figure 5. Afer verifying that the service works, suspend and delete the deployment to avoid charges (consumption-based plans are billed for any time a service package is deployed, whether or not it’s actu- ally running). Don’t delete the service itself, because doing so returns the Web site URL back into the pool of available URLs. Obviously, when you’re ready to fip the switch on a real production solution, you’ll have to budget for running Windows Azure services nonstop. When a service deployment fails, it doesn’t always tell you exactly what’s wrong. It usually doesn’t even tell you something is wrong. 
When a service deployment fails, it doesn't always tell you exactly what's wrong. It usually doesn't even tell you something is wrong. The service status just enters a loop such as "Initializing … Busy … Stopping … Initializing …" When this happens to you—and it will—look for problems such as attempting to access local resources (perhaps a local SQL Server database) or referencing assemblies that don't exist on Windows Azure. Enabling IntelliTrace when you deploy the package (see Figure 6) can help you pinpoint problems by identifying the specific exceptions that are being thrown.

Figure 5 A Windows Azure Deployed Service
Figure 6 Enabling IntelliTrace While Publishing to Windows Azure

Step 4: Create the ASP.NET MVC Project
The solution so far consists of an administrative Web site (albeit with no user-level security) that runs on Windows Azure and accesses a SQL Azure database, all with one line of code. The next step is to create the public Web site.

Add a new ASP.NET MVC 2 Web Application project to the solution named "CSSPublic" (don't create a unit test project while working through this example). If you're already quite experienced with ASP.NET MVC, you might prefer to start with an ASP.NET MVC 2 Empty Web Application, but I prefer to start from a Web site structure that already works and modify it bit by bit to make it do what I want. Right-click on CSSPublic to make it your startup project, and then run it to see what you're starting with.

The public site for CSS is read-only and anonymous, so remove all login and account functionality with these steps:
1. Delete the "logindisplay" div from Site.Master.
2. Delete the ApplicationServices connection string and authentication element from the main Web.config.
3. Delete AccountController.cs, AccountModels.cs, LogOnUserControl.ascx and the entire Account folder under Views.
4. Run it again to make sure it still works.
5. Copy the connection string from the CSSModel App.Config into the CSSPublic Web.config and add references to CSSModel and System.Data.Entity as before.
6. Select all the references for CSSPublic and set the Copy Local property to true.

I think it makes sense to add separate controllers (with associated views) for Companies and Stories, while keeping the Home controller as a landing page. These are important decisions; a designer can make the site look good, but the information architecture—the site structure—has to be right first. Naming is relevant in a Model View Controller (MVC) project. Use plural controller names (Companies, not Company) and matching view folders. Use the standard conventions of Index, Details and so on for controller methods and view names. You can tweak these conventions if you really want to, but that adds complexity.

Right-click on the Controllers folder to add a new controller named "CompaniesController." This site won't implement any behavior—it's read-only—so there's no need for an explicit model class. Treat the Entity Framework model container itself as the model. In CompaniesController.cs, add "using CSSModel" and change the Index method to return a list of companies as follows:

CSSModelContainer db = new CSSModelContainer();
return View(db.Companies.ToList());

To create the view, create an empty Companies folder under Views and right-click on it to add a view called "Index." Make it strongly typed with a View data class of CSSModel.Company and View content of List.
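Pulling those pieces together, the controller at this stage might look like the following sketch. Only the two-line Index body comes verbatim from the article; the class shell and using directives follow the standard ASP.NET MVC 2 template, and the namespace is an assumption based on the project name.

using System.Linq;
using System.Web.Mvc;
using CSSModel;

namespace CSSPublic.Controllers   // assumed default namespace
{
    public class CompaniesController : Controller
    {
        // GET: /Companies/
        public ActionResult Index()
        {
            CSSModelContainer db = new CSSModelContainer();
            return View(db.Companies.ToList());
        }
    }
}

The Details action described next in the article is added to this same class.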
In Site.Master, add a list item in the menu to reference the new Controller:

<li><%: Html.ActionLink("Companies", "Index", "Companies")%></li>

Run the app and click the menu to see the list of Companies. The default view is a good starting point, but remove the unnecessary "Id" field. Because this site is intended to be read-only, delete the ActionLink entries for "Edit," "Delete" and "Create." Finally, make the company name itself a link to the Details view:

<%: Html.ActionLink(item.Name, "Details", new { id=item.Id })%>

Your Companies list should now look similar to what is shown in Figure 7.

Figure 7 A List of Companies in ASP.NET MVC Web Site

To implement Details, add a method to CompaniesController:

public ActionResult Details(int id)
{
  CSSModelContainer db = new CSSModelContainer();
  return View(db.Companies.Single(c => c.Id.Equals(id)));
}

The id parameter represents the integer following Companies/Details/ in the URL. That integer is used to find the appropriate company using a simple LINQ expression. Add the Details view underneath the Companies folder as before, but this time select "Details" as the View content and name the view "Details." Run the project, navigate to Companies, and then click one of the company names to see the default Details view.

Adding a controller and views for Stories is similar (a sketch appears at the end of this step). The views still require a fair amount of tweaking before being ready for a designer, but this is a good start, and it's easy to evolve.

To verify this project will work on Windows Azure, create a cloud service project called "CSSPublicService" and add the CSSPublic role to it. Run the service locally in the dev fabric and then publish the site to Windows Azure and run it from the public URL. Don't forget to suspend and delete the deployment when you're done to avoid being billed.
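As a sketch of the "similar" Stories controller mentioned above: the pattern simply mirrors CompaniesController. The class shell is mine, and the Stories entity set name is an assumption consistent with the domain service code shown in Step 5.

using System.Linq;
using System.Web.Mvc;
using CSSModel;

namespace CSSPublic.Controllers   // assumed default namespace
{
    public class StoriesController : Controller
    {
        public ActionResult Index()
        {
            CSSModelContainer db = new CSSModelContainer();
            return View(db.Stories.ToList());
        }

        public ActionResult Details(int id)
        {
            CSSModelContainer db = new CSSModelContainer();
            return View(db.Stories.Single(s => s.Id.Equals(id)));
        }
    }
}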
Step 5: Create the WCF RIA Services Project
The solution at this point contains ASP.NET MVC and ASP.NET Dynamic Data Web sites running (or at least runnable) on Windows Azure, using a shared Entity Framework model to access a SQL Azure database. Very little manual plumbing code has been required to obtain a decent amount of functionality. Adding a WCF RIA Services Web site adds another dimension: a rich editing experience.

Add a Silverlight Business Application project named "CSSCustomerEdit" (no spaces) and Visual Studio adds two projects to your solution: a Silverlight client (CSSCustomerEdit) and a Web service (CSSCustomerEdit.Web). Run the solution to see what you're starting with. Open ApplicationStrings.resx in the CSSCustomerEdit project and change the value for ApplicationName to "Customer Success Stories" to make it look nice.

In CSSCustomerEdit.Web, copy connectionStrings from CSSModel into Web.config, add references to CSSModel and System.Data.Entity and set Copy Local to true for all the references. Then right-click on the Services folder and add a new Domain Service Class item called "CSSDomainService." Make sure the name of this class ends in Service—without a number—to gain the full benefit of the tooling between the two projects (another "secret handshake"). Click OK to bring up the Add New Domain Service Class dialog and check all the entities along with Enable editing for each (Figure 8).

Figure 8 Adding a Domain Service Class

Notice that "Generate associated classes for metadata" is grayed out. This illustrates a tradeoff with the approach I'm espousing here. In a Silverlight Business Application, metadata classes can be used to add additional validation logic such as ranges and display defaults. However, when the Entity Framework model is in a separate project from CSSCustomerEdit.Web, the toolkit doesn't let you add these metadata classes. If this feature is important to you, or if you know you're going to invest most of your energy in the Silverlight Business Application part of your solution, you might want to create your Entity Framework model directly in the ".Web" project instead of a separate project. You could still reference CSSCustomerEdit.Web to share the Entity Framework model in another project.

As mentioned, authentication and authorization are out of scope for this article, but it's possible to punt and still be precise. In the CSSDomainService class, add a placeholder property named "myCompany" to represent the company the user is authorized to edit. For now, hardcode it to 1, but eventually the login process will set it to the right company for the authenticated user.

Edit the CSSDomainService class to reflect the specific use case for the project: the user can update companies but not insert or delete them (an administrator does that in the ASP.NET Dynamic Data Web site), so remove those service methods. Also, the user can only edit the single company they work for, not a list of companies, so change GetCompanies to GetMyCompany. Similarly, change GetStories to GetMyStories and ensure that the user creates stories whose CompanyId is equal to myCompany:

private int myCompany = 1; // TODO: set myCompany during authentication

public Company GetMyCompany()
{
  return this.ObjectContext.Companies.Single(c=>c.Id.Equals(myCompany));
}
...
public IQueryable<Story> GetMyStories()
{
  return this.ObjectContext.Stories.Where(s=>s.CompanyId.Equals(myCompany));
}

public void InsertStory(Story story)
{
  story.CompanyId = myCompany;
  story.Id = -1; // database will replace with next auto-increment value
  ...
}

WCF RIA Services shines in the creation of editable, field-oriented interfaces, but it's important to start simply and add functionality slowly. The DataGrid and DataForm controls are powerful, but whenever I work too fast or try to add too much functionality at once, I end up messing up and having to backtrack. It's better to work incrementally and add one UI improvement at a time.

To implement a baseline UI for this example, add references to System.Windows.Controls.Data and System.Windows.Controls.DomainServices in CSSCustomerEdit. Create new views (Silverlight Page items) for Company (singular) and Stories, then mimic the XAML from the existing Home and About views. Edit MainPage.xaml to add new dividers and link buttons (alternatively, just co-opt the existing Home and About views to use for Company and Stories).

In Silverlight development, most of the magic involves editing XAML. In CSSCustomerEdit, add namespace entries, a DomainDataSource and a DataForm for the Company view. In addition, add a DataGrid for the Stories view. In both DataForms, handle the EditEnded event to call MyData.SubmitChanges. Stories.xaml should then look similar to Figure 9.

Figure 9 XAML for the Stories View

Build it … run it … it works! A rich editing experience that's ready to evolve (see Figure 10). As before, create a new cloud service project, publish it and test it on Windows Azure. Copy CSSCustomerEditTestPage.aspx to Default.aspx for a cleaner experience, and you're done.

Figure 10 The Stories View in Action
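The EditEnded wiring mentioned above lives in the view's code-behind. Here is a hedged sketch: MyData is the DomainDataSource name used in the article, while the handler name, the commit check and the exact event-args type are assumptions based on the Silverlight Toolkit DataForm, not code from the article.

// In the view's code-behind (sketch). Assumes the XAML declares a
// DomainDataSource named MyData and wires the DataForm's EditEnded
// event to this handler.
private void DataForm_EditEnded(object sender,
    System.Windows.Controls.DataFormEditEndedEventArgs e)
{
    // Only push changes through the domain service when the edit was committed.
    if (e.EditAction == System.Windows.Controls.DataFormEditAction.Commit)
    {
        MyData.SubmitChanges();
    }
}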
No 'One True Way'
Visual Studio and the .NET Framework provide myriad choices for creating solutions that can run on Windows Azure. While it's tempting to search for the "one true way" to create the next killer app for the cloud, it's more rewarding to leverage capabilities from multiple technologies to solve complex problems. It's easier to code, easier to evolve and—thanks to Windows Azure—easier to operate.

PATRICK FOLEY is an ISV architect evangelist for Microsoft, which means he helps software companies succeed in building on the Microsoft platform. Read his blog at pfoley.com.

THANKS to the following technical expert for reviewing this article: Hanu Kommalapati

ASP.NET DYNAMIC DATA
Build a Data-Driven Enterprise Web Site in 5 Minutes
James Henry

For years, developers have wrestled with the tedious tasks of building data layers with Create, Read, Update and Delete (CRUD) functionality to be consumed by the UI. I can personally remember a project from more than 10 years ago that required me to write code that automatically generated business and data layers from a Rational Rose object model. It was a lot of work.

A few years later Microsoft introduced the Microsoft .NET Framework and the DataSet object. With the DataSet designer, you could connect a strongly typed DataSet to a back-end database such as SQL Server and then work with the DataSet throughout the application.
This minimized the need to work directly with SQL, but it still required the manual creation of Web pages to display data from the database.

Fast-forward a few more years and Microsoft introduced the Entity Framework. This Object Relational Mapping (ORM) framework uses LINQ to abstract a database schema and presents it to an application as a conceptual schema. Like the DataSet, this technology also minimized the need to work directly with SQL. However, it still required the manual creation of Web pages to display data.

Now Microsoft has introduced ASP.NET Dynamic Data—a combination of the Entity Framework and ASP.NET Routing—which allows an application to respond to URLs that do not physically exist. With these features you can create a production-ready, data-driven Web site in just a few minutes. In this article I'll show you how.

This article discusses:
• Starting a Dynamic Data project
• Using ASP.NET Routing
• Supporting metadata
• Customizing Dynamic Data templates
Technologies discussed: ASP.NET, SQL Server

Getting Started
Just to set up the scenario, let's say I'm building an intranet Web site for a fictional company called Adventure Works. This Web site allows the company to manage employee information. The company's data is stored in a Microsoft SQL Server 2008 database. (You can download and install the database from msftdbprodsamples.codeplex.com.)

Now, open Microsoft Visual Studio 2010 and create a new ASP.NET Dynamic Data Entities Web Site C# project. The ASP.NET Dynamic Data Entities Web Site project type takes advantage of ASP.NET Routing and the Entity Framework to enable you to quickly create data-driven Web sites. To get this functionality, you need to add an Entity Framework data model to the project. To do this, choose Add New Item from the Web site menu. In the Add New Item dialog box, choose ADO.NET Entity Data Model. Name it HumanResources.edmx. Visual Studio will then prompt you to add the model to the App_Code folder. Choose Yes and allow it to make this change.

When you create Web site projects, Visual Studio dynamically compiles all code that is placed in the App_Code folder. In the case of the Entity Data Model, Visual Studio automatically generates a partial class for the data context and partial classes for the entities. In this example, it places the code in a file named HumanResources.Designer.cs.
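Because the generated classes are partial, you can extend them in your own files without ever touching HumanResources.Designer.cs; that is the same mechanism the metadata classes later in this article rely on. A hedged illustration follows: the Duration property is made up, and it leans on the Shift entity's StartTime and EndTime fields, which appear later in the article.

using System;

namespace AdventureWorksModel
{
    // Illustrative only: a second partial declaration of a generated entity,
    // kept in a separate file so the designer can regenerate its own code freely.
    public partial class Shift
    {
        public TimeSpan Duration
        {
            get { return EndTime - StartTime; }
        }
    }
}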
Next, the Entity Data Model Wizard appears, as shown in Figure 1. Choose Generate from Database and click Next. Now choose a connection to the Adventure Works database. If a connection doesn't already exist, you need to create a new one. Figure 2 shows a connection to AdventureWorks on a SQL Server instance named Dev\BlueVision.

Figure 1 Starting the Entity Data Model Wizard
Figure 2 Configuring the Data Connection

On the next page you can choose all tables in the Human Resources schema. The name of the schema appears in parentheses. Figure 3 shows some of the tables that are selected. When you click Finish, Visual Studio automatically generates entities from the tables that you chose for your project.

Figure 3 Selecting Tables for the Human Resources Schema in Visual Studio

Figure 4 shows seven entities that Visual Studio generated from the database schema. Visual Studio used the foreign key constraints in the database to create relationships between the entities. For example, the Employee entity participates in a one-to-many relationship with the JobCandidate entity. This means that an employee can be a candidate for multiple jobs within the company. Also note that the EmployeeDepartmentHistory entity joins the Employee and Department entities. Had the EmployeeDepartmentHistory table contained only the fields that were necessary for joining the Employee and Department tables, Visual Studio would have simply omitted the EmployeeDepartmentHistory entity. This would have allowed direct navigation between the Employee and Department entities.

Figure 4 Entities Generated from Database Tables in Visual Studio

Using ASP.NET Routing
ASP.NET Routing allows an application to respond to URLs that do not physically exist. For example, two URLs—http://mysite/Employee and http://mysite/Department—could be directed to a page at http://mysite/Template.aspx. The page itself could then extract information from the URL to determine whether to display a list of employees or a list of departments, with both views using the same display template.

A route is simply a URL pattern that's mapped to an ASP.NET HTTP handler. It's the handler's responsibility to determine how the actual URL maps to the interpreted pattern. ASP.NET Dynamic Data uses a route handler named DynamicDataRouteHandler that interprets placeholders named {table} and {action} in URL patterns. For example, a handler could use this URL pattern:

http://mysite/{table}/{action}.aspx

That pattern could then be used to interpret the URL http://mysite/Employee/List.aspx. Employee is processed as the {table} placeholder and List is processed as the {action} placeholder. The handler could then display a list of Employee entity instances. DynamicDataRouteHandler uses the {table} placeholder to determine the name of the entity to display, and it uses the {action} parameter to determine the page template used to display the entity.

The Global.asax file contains a static method named RegisterRoutes that gets called when the application first starts, as shown here:

public static void RegisterRoutes(RouteCollection routes)
{
  // DefaultModel.RegisterContext(typeof(YourDataContextType),
  //   new ContextConfiguration() { ScaffoldAllTables = false });
  routes.Add(new DynamicDataRoute("{table}/{action}.aspx")
  {
    Constraints = new RouteValueDictionary(
      new { action = "List|Details|Edit|Insert" }),
    Model = DefaultModel
  });
}

The first thing you need to do is uncomment the code that calls the RegisterContext method of the DefaultModel instance. This method registers the Entity Framework data context with ASP.NET Dynamic Data. You need to modify the line of code as follows:

DefaultModel.RegisterContext(
  typeof(AdventureWorksModel.AdventureWorksEntities),
  new ContextConfiguration() { ScaffoldAllTables = true });

The first parameter specifies the type of the data context, which is AdventureWorksEntities in this example. By setting the ScaffoldAllTables property to true, you allow all entities to be viewed on the site. Had you set this property to false, you'd need to apply the ScaffoldTable(true) attribute to each entity that should be viewed on the site. (In practice, you should set the ScaffoldAllTables property to false to prevent the accidental exposure of the entire database to end users.)

The Add method of the RouteCollection class adds a new Route instance to the route table. With ASP.NET Dynamic Data, the Route instance is actually a DynamicDataRoute instance. The DynamicDataRoute class internally uses DynamicDataRouteHandler as its handler.
The parameter passed to the constructor of DynamicDataRoute represents the pattern that the handler should use to process physical URLs. A constraint is also set to limit the {action} values to List, Details, Edit or Insert.

This is theoretically all you need to do to use ASP.NET Dynamic Data. If you build and run the application, you should see the page shown in Figure 5.

Figure 5 A Basic ASP.NET Dynamic Data Site

Supporting Metadata
One thing you should notice is that the entity names are identical to the table names that they represent. In code and in the database this might be fine; however, usually this is not the case in the UI. ASP.NET Dynamic Data makes it easy to change the name of a displayed entity by applying the Display attribute to the class that represents that entity. Because the entity classes are contained in a file that's automatically regenerated by Visual Studio, you should make any code changes to partial classes in a separate code file. Therefore, add a new code file named Metadata.cs to the App_Code folder. Then add the following code to the file:

using System;
using System.Web;
using System.ComponentModel;
using System.ComponentModel.DataAnnotations;

namespace AdventureWorksModel
{
  [Display(Name = "Employee Addresses")]
  public partial class EmployeeAddress
  {
  }
}

Then rebuild and run the application. You should notice that EmployeeAddresses is now Employee Addresses. Similarly, you can change the names of EmployeeDepartmentHistories, EmployeePayHistories and JobCandidates to more appropriate names (a sketch follows).
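A sketch of those additional renames, added inside the same AdventureWorksModel namespace block in Metadata.cs. The display text shown here is my guess at "more appropriate names," not wording from the article, and EmployeePayHistory is left out because the article returns to it below.

// Illustrative renames in Metadata.cs; display names are assumed.
[Display(Name = "Employee Department Histories")]
public partial class EmployeeDepartmentHistory
{
}

[Display(Name = "Job Candidates")]
public partial class JobCandidate
{
}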
Next click the Shifts link. This displays a list of employee shifts. I'll change the names StartTime and EndTime to Start Time and End Time, respectively. Another issue is that the Start Time and End Time values show both the date and the time. In this context, the user really only needs to see the time, so I'll format the Start Time and End Time values so that they display times only. An attribute named DataType allows you to specify a more specific data type for a field, such as EmailAddress or Time. You apply the DataType attribute to the field to which it applies.

First, open the Metadata.cs file and add the following class definitions:

[MetadataType(typeof(ShiftMetadata))]
public partial class Shift
{
}

public partial class ShiftMetadata
{
}

Notice that an attribute named MetadataType is applied to the Shift class. This attribute allows you to designate a separate class that contains additional metadata for an entity's fields. You can't apply the additional metadata directly to members of the Shift class because you can't add the same class members to more than one partial class. For example, the Shift class defined in HumanResources.Designer.cs already has a field named StartTime. So you can't add the same field to the Shift class defined in Metadata.cs. Also, you shouldn't manually modify HumanResources.Designer.cs because that file gets regenerated by Visual Studio. The MetadataType attribute allows you to specify an entirely different class so that you can apply metadata to a field.

Add the following code to the ShiftMetadata class:

[DataType(DataType.Time)]
[Display(Name = "Start Time")]
public DateTime StartTime { get; set; }

The DataType attribute specifies the Time enumerated value to indicate that the StartTime field should be formatted as a time value. Other enumerated values include PhoneNumber, Password, Currency and EmailAddress, to name a few. The Display attribute specifies the text that should be displayed as the field's column when the field is displayed in a list, as well as the field's label when it's displayed in edit or read-only mode.

Now rebuild and run the application. Figure 6 shows the result of clicking the Shifts link.

Figure 6 Revised Shifts Page Using MetadataType Definitions

You can add similar code to change the appearance of the EndTime field (see the sketch that follows).
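The matching EndTime member goes in the same ShiftMetadata class; the display text is my wording of the rename described above.

// Added to ShiftMetadata alongside StartTime.
[DataType(DataType.Time)]
[Display(Name = "End Time")]
public DateTime EndTime { get; set; }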
Now let's take a look at the JobCandidates link. The Employee column displays values for the NationalIDNumber field. This might not be useful. Although the Employee database table doesn't have a field for an employee's name, it does have a field for an employee's login, which is LoginID. This field might provide more useful information to a user of this site. So I'll again modify the metadata code so that all Employee columns display the value of the LoginID field. Open the Metadata.cs file and add the following class definition:

[DisplayColumn("LoginID")]
public partial class Employee
{
}

The DisplayColumn attribute specifies the name of the field from an entity that should be used to represent instances of that entity. Figure 7 shows the new list with LoginIDs.

Figure 7 Identifying Employees with LoginIDs

The ContactID field of the Employee entity actually refers to a row in the Contact table, which is part of the Person schema in the database. Because I didn't add any tables from the Person schema, Visual Studio allows direct editing of the ContactID field. To enforce relational integrity, I'll prevent editing of this field while still allowing it to be displayed for reference. Open the Metadata.cs file and modify the Employee class by applying the following MetadataType attribute:

[DisplayColumn("LoginID")]
[MetadataType(typeof(EmployeeMetadata))]
public partial class Employee
{
}

Then define the EmployeeMetadata class as follows:

public partial class EmployeeMetadata
{
  [Editable(false)]
  public int ContactID { get; set; }
}

The Editable attribute specifies whether or not a field is editable in the UI.

Next, I'll add metadata to the EmployeePayHistory entity to display the Rate field as Hourly Rate, as well as format the field's value as currency. Add the following class definitions to the Metadata.cs file:

[MetadataType(typeof(EmployeePayHistoryMetadata))]
public partial class EmployeePayHistory
{
}

public partial class EmployeePayHistoryMetadata
{
  [DisplayFormat(DataFormatString="{0:c}")]
  [Display(Name = "Hourly Rate")]
  public decimal Rate { get; set; }
}

Customizing Templates
The Visual Studio project contains a folder named FieldTemplates. This folder contains user controls for editing fields of various data types. By default, ASP.NET Dynamic Data associates a field with a user control that has the same name as the field's associated data type. For example, the Boolean.ascx user control contains the UI for displaying Boolean fields, while the Boolean_Edit.ascx user control contains the UI for editing Boolean fields. Alternatively, you can apply the UIHint attribute to a field to ensure that a different user control is used.

I'll add a custom field template to display a Calendar control for editing of the Employee entity's BirthDate field. In Visual Studio, add a new Dynamic Data Field item to the project and name it Date.ascx. Visual Studio automatically adds a second file named Date_Edit.ascx to the project.
The code-behind for the edit template is shown in Figure 8.

Figure 8 Custom Date_EditField Class

public partial class Date_EditField :
  System.Web.DynamicData.FieldTemplateUserControl
{
  protected void Page_Load(object sender, EventArgs e)
  {
    DateCalendar.ToolTip = Column.Description;
  }

  protected override void OnDataBinding(EventArgs e)
  {
    base.OnDataBinding(e);
    if (Mode == DataBoundControlMode.Edit && FieldValue != null)
    {
      DateTime date = DateTime.MinValue;
      DateTime.TryParse(FieldValue.ToString(), out date);
      DateCalendar.SelectedDate = DateCalendar.VisibleDate = date;
    }
  }

  protected override void ExtractValues(IOrderedDictionary dictionary)
  {
    dictionary[Column.Name] = ConvertEditedValue(
      DateCalendar.SelectedDate.ToShortDateString());
  }

  public override Control DataControl
  {
    get { return DateCalendar; }
  }
}

Figure 9 Customized Employee Edit Form

Attribute Changes in the .NET Framework 4
Starting with the Microsoft .NET Framework 4, we recommend that you use the Display attribute instead of the DisplayName attribute from the .NET Framework 3.5. DisplayName is obviously still in the framework, but we recommend against it whenever the Display attribute can be used. There are two reasons for preferring Display instead of DisplayName. First, the Display attribute supports localization while the DisplayName attribute does not. Second, the Display attribute allows you to control all kinds of things. For example, you can control the text for the various ways a field can be displayed (prompt, header and so on), whether a field shows up as a filter, or whether the field is shown in the scaffold (AutoGenerate=false turns it off). So, while the code shown in the examples in this article is completely valid, we recommend that you replace DisplayName and ScaffoldColumn with the Display attribute. You still need to use ScaffoldTable and the DisplayName attribute at the class level. Following these recommendations is best because other teams at Microsoft support the DataAnnotations namespace (WCF RIA Services), and following these practices will make sure code works with them.
—Scott Hunter, Senior Program Manager Lead, ASP.NET
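Returning to the custom field template: one way to point the Employee BirthDate field at the new Date.ascx/Date_Edit.ascx templates is the UIHint attribute the article names as the explicit alternative to name matching. The following is a hedged sketch of that wiring, added as another partial part of the EmployeeMetadata class in Metadata.cs; the article itself does not show this step in the excerpt above.

// Sketch: a second partial part of EmployeeMetadata in Metadata.cs.
public partial class EmployeeMetadata
{
  [UIHint("Date")]  // "Date" matches the field template file names Date.ascx/Date_Edit.ascx
  public DateTime BirthDate { get; set; }
}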
FieldTemplates folder. I'll first replace the content of the Date_Edit.ascx page with the following markup:

<asp:Calendar ID="DateCalendar" runat="server"></asp:Calendar>

I'll then modify the Date_Edit.ascx.cs file with the complete class definition shown in Figure 8. I override the OnDataBinding method to set the SelectedDate and VisibleDate properties of the Calendar control to the value of the FieldValue field. The FieldValue field is inherited from FieldTemplateUserControl, and it represents the value of the data field being rendered. I also modify the overridden ExtractValues method to save any changes to the SelectedDate property to a dictionary of field name-value pairs. ASP.NET Dynamic Data uses the values in this dictionary to update the underlying data source.

Next, I need to inform ASP.NET Dynamic Data to use the Date.ascx and Date_Edit.ascx field templates for the BirthDate field.
I can accomplish this in one of two ways. First, I can apply the UIHint attribute as follows:

[UIHint("Date")]
public DateTime BirthDate { get; set; }

Alternatively, I can apply the DataType attribute as follows:

[DataType(DataType.Date)]
public DateTime BirthDate { get; set; }

The DataType attribute provides automatic mapping by matching the data type name to the name of the user control. The UIHint attribute gives you greater control in situations where the field type doesn't match the name of the user control. Figure 9 shows the result of editing an employee. If you change the birth date of the selected employee and click Update, the new data will be saved to the database.

The PageTemplates folder contains page templates that render the appropriate views for entities. By default, five page templates are supported: List.aspx, Edit.aspx, Details.aspx, Insert.aspx and ListDetails.aspx. The List.aspx page template renders a UI for displaying entities as tabular data. The Details.aspx page template renders a read-only view of an entity, while the Edit.aspx page template displays an editable view of an entity. The Insert.aspx page renders an editable view with default field values. The ListDetails.aspx page template allows you to view a list of entities as well as details for the selected entity on a single page.

As I mentioned earlier in the article, ASP.NET Dynamic Data automatically routes URL requests to the appropriate page by examining the value of the {action} parameter defined for the route. For example, if ASP.NET Dynamic Data evaluates an {action} parameter as List, it uses the List.aspx page template to display a list of entities. You can alter the existing page templates or add a new one to the folder. If you add a new one, you must ensure that you add it to the route table in the Global.asax file (see the sketch near the end of this article).

The EntityTemplates folder contains templates for displaying entity instances in read-only, edit and insert modes. By default, this folder contains three templates named Default.ascx, Default_Edit.ascx and Default_Insert.ascx, which display entity instances in read-only, edit and insert mode, respectively. To create a template that applies to only a specific entity, simply add a new user control to the folder and give it the name of the entity set. For example, if you add a new user control named Shifts.ascx to the folder, ASP.NET Dynamic Data uses this user control to render the read-only mode for a Shift entity (Shifts entity set). Similarly, Shifts_Edit.ascx and Shifts_Insert.ascx render the edit and insert modes of the Shift entity, respectively.

For each list of entities, ASP.NET Dynamic Data uses the entity's foreign key fields, Boolean fields and enumeration fields to build a filter list. The filters are added as DropDownList controls to the list page, as shown in Figure 10. For Boolean filters, the DropDownList control simply contains three values: All, True and False. For enumeration filters, the DropDownList control contains all enumerated values. For foreign key filters, the DropDownList control contains all distinct foreign key values. The filters are defined as user controls in the Filters folder. By default, only three user controls exist: Boolean.ascx, Enumeration.ascx and ForeignKey.ascx.

Ship It!
Although this was a fictional scenario, you've seen that it's entirely possible to create a fully operational Human Resources Web site in just a few minutes. I then proceeded to enhance the UI by adding metadata and a custom field template.
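One practical note before wrapping up: the route-table registration mentioned earlier looks roughly like this in the Global.asax produced by the Dynamic Data project template. This is a minimal sketch—DefaultModel is the template's static MetaModel property, and the added Report action and its Report.aspx page template are hypothetical:

routes.Add(new DynamicDataRoute("{table}/{action}.aspx")
{
  // Adding a new page template means adding its action name here, too,
  // so that a request such as /Employees/Report.aspx can be routed to it.
  Constraints = new RouteValueDictionary(
    new { action = "List|Details|Edit|Insert|Report" }),
  Model = DefaultModel
});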
ASP.NET Dynamic Data provides out-of-the-box functionality that allows you to get a site up and running quickly. However, it's also entirely customizable, allowing it to meet the needs of individual developers and organizations. ASP.NET Dynamic Data support for ASP.NET Routing allows you to reuse page templates for CRUD operations. If you've become frustrated with having to continually perform the tedious tasks of implementing CRUD pages on every Web application project, then ASP.NET Dynamic Data should make your life a lot easier.

JAMES HENRY is an independent software developer for BlueVision LLC, a company that specializes in Microsoft technology consulting. He's the author of "Developing Business Intelligence Solutions Using Information Bridge and Visual Studio .NET" (Blue Vision, 2005) and "Developing .NET Custom Controls and Designers Using C#" (Blue Vision, 2003). He can be reached at [email protected].

THANKS to the following technical expert for reviewing this article: Scott Hunter

Figure 10 Including Data Filters on the Page

You can apply the UIHint attribute to a field to ensure that a different user control is used.

THE WORKING PROGRAMMER  TED NEWARD

Multiparadigmatic .NET, Part 5: Automatic Metaprogramming

In last month's piece, objects came under the microscope, and in particular we looked at the "axis" of commonality/variability analysis that inheritance offers us. While inheritance isn't the only form of commonality/variability available within a modern object-oriented (OO) language such as C# or Visual Basic, it certainly stands at the center of the OO paradigm. And, as also discussed, it doesn't always provide the best solution to all problems.

To recap, what we've discovered so far is that C# and Visual Basic are both procedural and OO languages—but clearly the story doesn't stop there. Both are also metaprogrammatic languages—each offers the Microsoft .NET Framework developer the opportunity to build programs out of programs, in a variety of different ways: automatic, reflective and generative.

Automatic Metaprogramming
The core idea behind metaprogramming is simple: the traditional constructs of procedural or OO programming haven't solved quite all of our software design issues, at least not in a way that we find satisfying. For example, to cite a basic flaw, developers frequently found a need for a data structure that maintained an ordered list of some particular type, such that we could insert items into the list in a particular slot and see the items in that exact order. For performance reasons, sometimes the list wanted to be a linked list of nodes. In other words, we wanted an ordered linked list, but strongly typed to the type being stored within it.

Developers who came to the .NET Framework from the C++ world know one solution to this problem—that of parameterized types, also known more informally as generics. But, as developers who came to .NET through Java's early days know, another solution emerged long before templates (which did, eventually, make it into the Java platform). That solution was to simply write each needed list implementation as necessary, as shown in Figure 1.

Class ListOfInt32
  Class Node
    Public Sub New(ByVal dt As Int32)
      data = dt
    End Sub
    Public data As Int32
    Public nextNode As Node = Nothing
  End Class

  Private head As Node = Nothing

  Public Sub Insert(ByVal newParam As Int32)
    If IsNothing(head) Then
      head = New Node(newParam)
    Else
      Dim current As Node = head
      While (Not IsNothing(current.nextNode))
        current = current.nextNode
      End While
      current.nextNode = New Node(newParam)
    End If
  End Sub

  Public Function Retrieve(ByVal index As Int32)
    Dim current As Node = head
    Dim counter = 0
    While (Not IsNothing(current.nextNode) And counter < index)
      current = current.nextNode
      counter = counter + 1
    End While
    If (IsNothing(current)) Then
      Throw New Exception("Bad index")
    Else
      Retrieve = current.data
    End If
  End Function
End Class

Figure 1 An Example of Writing List Implementations as Necessary

Now, obviously, this fails the Don't Repeat Yourself (DRY) test—every time the design calls for a new list of this kind, it will need to be written "by hand," which will clearly be a problem as time progresses. Although not complicated, it's still going to be awkward and time-consuming to write each of these, particularly if more features become necessary or desirable.

Of course, nobody ever said developers had to be the ones writing such code. Which brings us around neatly to the solution of code generation, or as it's sometimes called, automatic metaprogramming. Another program can easily do it, such as a program designed to kick out classes that are customized to each type needed, as shown in Figure 2.

Then, once the class in question has been created, it needs only to be compiled and either added to the project or else compiled into its own assembly for reuse as a binary. Of course, the language being generated doesn't have to be the language in which the code generator is written—in fact, it will often help immensely if it isn't, because then it will be easier to keep the two more clearly distinct in the developer's head during debugging.

Commonality, Variability and Pros and Cons
In the commonality/variability analysis, automatic metaprogramming occupies an interesting place. In the Figure 2 example, it places structure and behavior (the outline of the class above) into commonality, allowing for variability along data/type lines—that of the type being stored in the generated class. Clearly, we can swap in any type desired into the ListOf type.

But automatic metaprogramming can reverse that, too, if necessary.
Using a rich templating language, such as the Text Template Transformation Toolkit (T4) that ships with Visual Studio, the code generation templates can do source-time decision making, which then allows the template to provide commonality along data/structural lines, and vary by structural and behavioral lines. In fact, if the code template is sufficiently complex (and this isn't necessarily a good angle to pursue), it's even possible to eliminate commonality altogether and vary everything (data, structure, behavior and so on). Doing so typically becomes unmanageable quite quickly, however, and in general should be avoided. That leads to one of the key realizations about automatic metaprogramming: Because it lacks any sort of inherent structural restrictions, choose the commonality and variability explicitly, lest the source code template grow out of control trying to be too flexible. For example, given the ListOf example in Figure 2, the commonality is in the structure and behavior, and the variability is in the data type being stored—any attempt to introduce variability in the structure or behavior should be considered a red flag and a potential slippery slope to chaos.

Obviously, code generation carries with it some significant risks, particularly around areas of maintenance: Should a bug be discovered (such as the concurrency one in the ListOf… example in Figure 2), fixing it isn't a simple matter. The template can obviously be fixed, but that doesn't do anything for the code already generated—each of those source artifacts needs to be regenerated, in turn, and this is something that's hard to track and ensure automatically. And, what's more, any handmade changes to those generated files will be intrinsically lost, unless the template-generated code has allowed for customizations. This risk of overwrite can be mitigated by use of partial classes, allowing developers to fill in the "other half" (or not) of the class being generated, and by extension methods, giving developers an opportunity to "add" methods to an existing family of types without having to edit the types (a short sketch of both follows below). But partial classes must be in place from the beginning within the templates, and extension methods carry some restrictions that prevent them from replacing existing behavior—again leaving neither approach as a good mechanism to carry negative variability.

Code Generation
Code generation—or automatic metaprogramming—is a technique that's been a part of programming for many years, ranging all the way from the C preprocessor macro through to the C# T4 engine, and will likely continue to carry forward, owing to the conceptual simplicity of the idea. However, its principal flaws are the lack of compiler structure and checking during the expansion process (unless, of course, that checking is done by the code generator itself, a task that's harder than it sounds) and the inability to capture negative variability in any meaningful way. The .NET Framework offers some mechanisms to make code generation easier—in many cases, those mechanisms were introduced to save other Microsoft developers some grief—but they won't eliminate all the potential pitfalls in code generation, not by a long shot. And yet, automatic metaprogramming remains one of the more widely used forms of metaprogramming. C# has a preprocessor, as do C++ and C, although the C# version doesn't perform macro expansion. (Using macros to create "tiny templates" was common before C++ got templates.)
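To make the partial-class and extension-method mitigations mentioned above concrete, here is a minimal sketch in C#. The type and member names (ListOfEmployee, Employee, head, Insert) are assumed for illustration and aren't taken from the article's figures:

// Employee.cs -- stub type for the sketch.
public class Employee { }

// ListOfEmployee.generated.cs -- emitted by the template; regenerated at will.
public partial class ListOfEmployee
{
  class Node
  {
    public Node(Employee dt) { data = dt; }
    public Employee data;
    public Node nextNode;
  }

  Node head;

  public void Insert(Employee newParam)
  {
    Node node = new Node(newParam);
    if (head == null) { head = node; return; }
    Node current = head;
    while (current.nextNode != null)
      current = current.nextNode;
    current.nextNode = node;
  }
}

// ListOfEmployee.cs -- the hand-written "other half"; it survives regeneration
// because the template never touches this file.
public partial class ListOfEmployee
{
  public bool IsEmpty
  {
    get { return head == null; }  // relies on the generated head field
  }
}

// Extension methods add members "from the outside" without editing the type,
// but they can't replace or hide behavior the template already emitted.
public static class ListOfEmployeeExtensions
{
  public static void InsertRange(this ListOfEmployee list,
    System.Collections.Generic.IEnumerable<Employee> items)
  {
    foreach (Employee item in items)
      list.Insert(item);
  }
}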
On top of that, using metaprogramming as part of a larger framework or library is common, particularly for inter-process communication scenarios (such as the client and server stubs generated by Windows Communication Foundation). Other toolkits use automatic metaprogramming to provide "scaffolding" to ease the early stages of an application (such as what we see in ASP.NET MVC). In fact, arguably every Visual Studio project begins with automatic metaprogramming, in the form of the "project templates" and "item templates" that most of us use to create new projects or add files to projects. And so on.

Like so many other things in computer science, automatic metaprogramming remains a useful and handy tool to have in the designer toolbox, despite its obvious flaws and pitfalls. Fortunately, it's far from the only meta-tool in the programmer's toolbox.

Sub Main(ByVal args As String())
  Dim CRLF As String = Chr(13).ToString() + Chr(10).ToString()
  Dim template As String = "Class ListOf{0}" + CRLF +
    "  Class Node" + CRLF +
    "    Public Sub New(ByVal dt As {0})" + CRLF +
    "      data = dt" + CRLF +
    "    End Sub" + CRLF +
    "    Public data As {0}" + CRLF +
    "    Public nextNode As Node = Nothing" + CRLF +
    "  End Class" + CRLF +
    "  Private head As Node = Nothing" + CRLF +
    "  Public Sub Insert(ByVal newParam As {0})" + CRLF +
    "    If IsNothing(head) Then" + CRLF +
    "      head = New Node(newParam)" + CRLF +
    "    Else" + CRLF +
    "      Dim current As Node = head" + CRLF +
    "      While (Not IsNothing(current.nextNode))" + CRLF +
    "        current = current.nextNode" + CRLF +
    "      End While" + CRLF +
    "      current.nextNode = New Node(newParam)" + CRLF +
    "    End If" + CRLF +
    "  End Sub" + CRLF +
    "  Public Function Retrieve(ByVal index As Int32)" + CRLF +
    "    Dim current As Node = head" + CRLF +
    "    Dim counter = 0" + CRLF +
    "    While (Not IsNothing(current.nextNode) And counter < index)" + CRLF +
    "      current = current.nextNode" + CRLF +
    "      counter = counter + 1" + CRLF +
    "    End While" + CRLF +
    "    If (IsNothing(current)) Then" + CRLF +
    "      Throw New Exception()" + CRLF +
    "    Else" + CRLF +
    "      Retrieve = current.data" + CRLF +
    "    End If" + CRLF +
    "  End Function" + CRLF +
    "End Class"

  If args.Length = 0 Then
    Console.WriteLine("Usage: VBAuto <listType>")
    Console.WriteLine("  where <listType> is a fully-qualified CLR typename")
  Else
    Console.WriteLine("Producing ListOf" + args(0))
    Dim outType As System.Type =
      System.Reflection.Assembly.Load("mscorlib").GetType(args(0))
    Using out As New StreamWriter(
        New FileStream("ListOf" + outType.Name + ".vb", FileMode.Create))
      out.WriteLine(template, outType.Name)
    End Using
  End If
End Sub

Figure 2 An Example of Automatic Metaprogramming

TED NEWARD is a principal with Neward & Associates, an independent firm specializing in enterprise .NET Framework and Java platform systems. He's written more than 100 articles, is a C# MVP and INETA speaker, and has authored and coauthored a dozen books, including "Professional F# 2.0" (Wrox, 2010). He also consults and mentors regularly. Reach him at [email protected] and read his blog at blogs.tedneward.com.

UI FRONTIERS  CHARLES PETZOLD

A Color Scroll for XNA

One of the first Windows programs I wrote for publication was called COLORSCR ("color scroll"), and it appeared in the May 1987 issue of Microsoft Systems Journal, the predecessor to this magazine. Over the years, I've frequently found it instructive to rewrite this program for other APIs and frameworks. Although the program is simple—you manipulate three scrollbars or sliders for red, green and blue values to create a custom color—it involves crucial tasks such as layout and event handling. In functionality, as well, the program is not strictly a simple, meaningless demo. If you ever found yourself creating a color-selection dialog, this program would be a good place to start.

In this article, I want to show you how to write a color-scroll program using the XNA Framework as implemented in Windows Phone 7. The result is shown in Figure 1. XNA is Microsoft .NET Framework-based managed code for games programming, but if you start searching through the Microsoft.Xna.Framework namespaces, you won't find any scrollbars or sliders or even buttons among the classes! XNA is simply not that kind of framework. If you want a slider in your program, it must be specially created for that job.

Showing you how to write an XNA color-scroll program isn't just an academic exercise, however. My intention is actually much broader. As you know, Windows Phone 7 supports both Silverlight and XNA, and generally for any particular application the choice between the two is obvious. However, you might find that the graphical demands of your application are simply too much for Silverlight, but at the same time you might be hesitant about using XNA instead because it's missing familiar amenities, such as buttons and sliders. My response is: Do not be afraid. Writing simple controls in XNA is easier than you probably think.

You can also consider this article as an introduction to XNA programming for Windows Presentation Foundation (WPF) and Silverlight programmers. I found myself applying certain concepts from WPF and Silverlight to this job, so the XNA color-scroll program is a celebration of synergy as well.

All the downloadable source code is in one Visual Studio solution called XnaColorScroll. This solution includes an XnaColorScroll project for Windows Phone 7 and a library called Petzold.Xna.Controls containing the four XNA "controls" that I created for this program. (Obviously, loading the solution into Visual Studio requires that you have the Windows Phone Development Tools installed.) Feel free to steal these controls and supplement them with your own for your own XNA controls library.

Code download available at code.msdn.microsoft.com/mag201101UIFrontiers.

Figure 1 The XnaColorScroll Program

Games and Components
Just as the main class of a WPF, Silverlight or Windows Forms program is a class that derives from Window, Page or Form, the main class of an XNA program derives from Game. An XNA program uses this Game derivative to draw bitmaps, text and 3D graphics on the screen. To help you organize your program into discrete functional entities, XNA supports the concept of a component, which is perhaps the closest that XNA comes to a control. Components come in two varieties: GameComponent derivatives and DrawableGameComponent derivatives. DrawableGameComponent itself derives from GameComponent, and the names suggest the difference. I'll be using both base classes in this exercise.

The Game class defines a property named Components of type GameComponentCollection. Any components that a game creates must be added to this collection. Doing so ensures that the component gets the same calls to various overridden methods that the game itself gets. A DrawableGameComponent draws on top of whatever the Game draws.

Let's begin the analysis of the XnaColorScroll program—not with the Game derivative, but with the GameComponent derivatives. Figure 2 shows the source code for a component I call TextBlock, and it's used in much the same way as a WPF or Silverlight TextBlock. Notice the public properties following the constructor in Figure 2. These indicate the text itself, and the font, color and position of the text relative to the screen. Two additional properties, HorizontalAlignment and VerticalAlignment, allow that position to reference the side or center of the text string. In XnaColorScroll, these properties are handy for centering the text relative to the sliders. The class also includes a get-only Size property, which returns the size of the rendered text with that particular font. (The XNA Vector2 structure is often used for representing two-dimensional positions and sizes as well as vectors.)

The TextBlock component overrides both the Update and Draw methods, which is normal for a drawable component and normal for Game derivatives as well. These two methods constitute the program's "game loop," and they're called at the same rate as the video display—30 times per second for Windows Phone 7. You can think of the Draw method like a WM_PAINT message or an OnPaint override, except that it's called for every screen refresh. The Update method is called before each Draw call and is intended to be used for performing all the necessary calculations and preparation for the Draw method.

TextBlock as written has lots of room for performance improvements. For example, it could cache the value of the Size property or only recalculate the textOrigin field whenever any property that contributes to it changes. These enhancements might take priority if you ever discover that the program has become sluggish because all the Update and Draw calls in the program are taking longer than 0.03 seconds to execute.

Components and Child Components
As you can see in Figure 1, XnaColorScroll has six TextBlock components but also three sliders, which are probably going to be a little more challenging. Very many years ago, I might have tried coding the entire slider in a single class. With my experience in WPF and Silverlight control templates, however, I realize that the Slider control might best be constructed by assembling a Thumb control and RepeatButton controls. It seems reasonable to try to create an XNA Slider component from similar child components.

A Game class adds child components to itself by adding them to its Components property of type GameComponentCollection. The big question now becomes: Does GameComponent itself have a Components property for its own child components? Unfortunately, the answer is no. However, a component has access to its parent Game class, and if the component needs to instantiate its own child components, it can add those additional components to the same Components collection of the Game class. Do this in the right order and you won't even need to mess around with the DrawOrder property—defined by DrawableGameComponent—that can help with the order in which components are rendered. By default, the Draw method in the Game class is called first, then all the Draw methods in the components as they appear in the Components collection.

So let's begin. Like TextBlock, the Thumb class derives from DrawableGameComponent. The entire Thumb class except for the touch processing is shown in Figure 3. Thumb defines the same three events as the WPF and Silverlight Thumb controls.
In the TextBlock component in Figure 2, the Position property is a point of type Vector2; in Thumb, the Position property is a Rectangle object, and it defines not only the location of the Thumb, but also its rectangular size. The Thumb visuals consist entirely of a white 1 x 1 pixel bitmap that's displayed at the location and size indicated by the Position property. (Using single-pixel bitmaps stretched to a particular size is one of my favorite XNA programming techniques.) The Thumb processes touch input but doesn't move itself. The Slider component is responsible for handling DragDelta events from the Thumb and setting a new Position property.

public class TextBlock : DrawableGameComponent
{
  SpriteBatch spriteBatch;
  Vector2 textOrigin;

  public TextBlock(Game game) : base (game)
  {
    // Initialize properties
    this.Color = Color.Black;
  }

  public string Text { set; get; }
  public SpriteFont Font { set; get; }
  public Color Color { set; get; }
  public Vector2 Position { set; get; }
  public HorizontalAlignment HorizontalAlignment { set; get; }
  public VerticalAlignment VerticalAlignment { set; get; }

  public Vector2 Size
  {
    get
    {
      if (String.IsNullOrEmpty(this.Text) || this.Font == null)
        return Vector2.Zero;
      return this.Font.MeasureString(this.Text);
    }
  }

  protected override void LoadContent()
  {
    spriteBatch = new SpriteBatch(this.GraphicsDevice);
    base.LoadContent();
  }

  public override void Update(GameTime gameTime)
  {
    if (!String.IsNullOrEmpty(Text) && Font != null)
    {
      Vector2 textSize = this.Size;
      float xOrigin = (int)this.HorizontalAlignment * textSize.X / 2;
      float yOrigin = (int)this.VerticalAlignment * textSize.Y / 2;
      textOrigin = new Vector2((int)xOrigin, (int)yOrigin);
    }
    base.Update(gameTime);
  }

  public override void Draw(GameTime gameTime)
  {
    if (!String.IsNullOrEmpty(Text) && Font != null)
    {
      spriteBatch.Begin();
      spriteBatch.DrawString(this.Font, this.Text, this.Position, this.Color,
        0, textOrigin, 1, SpriteEffects.None, 0);
      spriteBatch.End();
    }
    base.Draw(gameTime);
  }
}

Figure 2 The TextBlock Game Component

Besides the Thumb, my Slider also contains two RepeatButton components, one on each side of the Thumb. Like the WPF and Silverlight RepeatButton controls, these components generate a series of Click events if you hold your finger on them. In my XNA Slider, the RepeatButton components occupy a particular area of the screen but are actually invisible. (The thin vertical track down the center is drawn by Slider itself.) This means that RepeatButton can derive from GameComponent rather than DrawableGameComponent. Figure 4 shows the whole RepeatButton control except for the touch processing.

At this point, we're ready to build the Slider component. To keep it reasonably simple, I restricted myself to a vertical orientation. Slider has the usual public properties—Minimum, Maximum and Value, among others—and also defines a ValueChanged event.
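The Slider internals aren't shown in the article, but as a minimal sketch (the sliderValue backing-field name is assumed), the Value property might clamp to the Minimum/Maximum range and raise ValueChanged like this:

int sliderValue;
public event EventHandler ValueChanged;

public int Value
{
  get { return sliderValue; }
  set
  {
    // Keep the value inside the Minimum..Maximum range before storing it.
    int clamped = Math.Max(Minimum, Math.Min(Maximum, value));
    if (sliderValue != clamped)
    {
      sliderValue = clamped;
      if (ValueChanged != null)
        ValueChanged(this, EventArgs.Empty);
    }
  }
}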
In its Initialize override, Slider creates the three child components (and saves them as fields), sets the event handlers and adds them to the Components collection of its parent Game class, accessible through its Game property:

public override void Initialize()
{
  thumb = new Thumb(this.Game);
  thumb.DragDelta += OnThumbDragDelta;
  this.Game.Components.Add(thumb);

  btnDecrease = new RepeatButton(this.Game);
  btnDecrease.Click += OnRepeatButtonClick;
  this.Game.Components.Add(btnDecrease);

  btnIncrease = new RepeatButton(this.Game);
  btnIncrease.Click += OnRepeatButtonClick;
  this.Game.Components.Add(btnIncrease);

  base.Initialize();
}

XNA Game derivatives and GameComponent derivatives have three occasions to perform initialization—the class constructor, an override of the Initialize method and an override of the LoadContent method—and it might be a little confusing when to use which. As the name suggests, LoadContent is great for loading content (such as fonts or bitmaps) and should be used for all other initialization that depends on the existence of the GraphicsDevice object. I prefer creating components and adding them to the Components collection during the Initialize override, as shown in the previous code snippet.

In XnaColorScroll, the Game derivative class also creates six TextBlock components and three Slider components in its Initialize override. When the Initialize method in the base class is called at the end, then all the Initialize methods for the child components are called. The Initialize method in each Slider then creates the Thumb and two RepeatButton components. This scheme ensures that all these components are added to the Components collection in a proper order for rendering.

Processing Touch Input
The primary means of user input in most Windows Phone 7 programs is multi-touch, and XNA is no exception. (Keyboard input is also available to XNA programs, and programs can also use the phone's accelerometer as an input device.)

An XNA program obtains touch input during the Update override. There are no touch events. All touch input is polled and is accessible through the TouchPanel class. The static TouchPanel.GetState method provides low-level touch input that consists of TouchLocation objects indicating Pressed, Moved and Released activity with position information and integer ID numbers to distinguish between multiple fingers. The TouchPanel.ReadGesture method (which I don't use in this program) provides higher-level gesture support.

When I first started working with game components, I thought that each component could obtain touch input on its own, take what it needed and ignore the rest. This didn't seem to work very well; it was as if the game class and the components were all competing for touch input rather than sharing it like polite children.

I decided to work out another system for processing touch input. Only the Game class calls TouchPanel.GetState. Then, each TouchLocation object is passed to child components through a method I call ProcessTouch. This allows the components to get first dibs on the input. A component that makes use of a TouchLocation object returns true from the ProcessTouch method, and further processing for that particular TouchLocation object ends. (Returning true from ProcessTouch is like setting the Handled property of RoutedEventArgs to true in WPF or Silverlight.)
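The touch code itself is omitted from Figures 3 and 4, but the capture pattern just described might look something like this minimal sketch for a component with a Rectangle Position property and an int? touchId field, as in those figures (this is illustrative, not the downloadable code):

public bool ProcessTouch(GameTime gameTime, TouchLocation touch)
{
  switch (touch.State)
  {
    case TouchLocationState.Pressed:
      if (touchId == null &&
          this.Position.Contains((int)touch.Position.X, (int)touch.Position.Y))
      {
        touchId = touch.Id;        // capture this finger
        return true;
      }
      break;

    case TouchLocationState.Moved:
      if (touch.Id == touchId)
        return true;               // still ours; a real control reacts to the movement here
      break;

    case TouchLocationState.Released:
      if (touch.Id == touchId)
      {
        touchId = null;            // release the capture
        return true;
      }
      break;
  }
  return false;
}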
public class Thumb : DrawableGameComponent
{
  SpriteBatch spriteBatch;
  Texture2D tinyTexture;
  int? touchId;

  public event EventHandler<DragEventArgs> DragStarted;
  public event EventHandler<DragEventArgs> DragDelta;
  public event EventHandler<DragEventArgs> DragCompleted;

  public Thumb(Game game) : base(game)
  {
    // Initialize properties
    this.Color = Color.White;
  }

  public Rectangle Position { set; get; }
  public Color Color { set; get; }

  protected override void LoadContent()
  {
    spriteBatch = new SpriteBatch(this.GraphicsDevice);
    tinyTexture = new Texture2D(this.GraphicsDevice, 1, 1);
    tinyTexture.SetData<Color>(new Color[] { Color.White });
    base.LoadContent();
  }
  ...
  public override void Draw(GameTime gameTime)
  {
    spriteBatch.Begin();
    spriteBatch.Draw(tinyTexture, this.Position, this.Color);
    spriteBatch.End();
    base.Draw(gameTime);
  }
}

Figure 3 Most of the Thumb Class

In XnaColorScroll, the Update method in the main Game class calls TouchPanel.GetState to get all the touch input and then calls the ProcessTouch method in each Slider component, as shown here:

TouchCollection touches = TouchPanel.GetState();
foreach (TouchLocation touch in touches)
{
  for (int primary = 0; primary < 3; primary++)
    if (sliders[primary].ProcessTouch(gameTime, touch))
      break;
}

Notice how further processing of each TouchLocation object stops whenever ProcessTouch returns true. The ProcessTouch method in each Slider control then calls the ProcessTouch methods in the two RepeatButton components and the Thumb component:

public bool ProcessTouch(GameTime gameTime, TouchLocation touch)
{
  if (btnIncrease.ProcessTouch(gameTime, touch))
    return true;

  if (btnDecrease.ProcessTouch(gameTime, touch))
    return true;

  if (thumb.ProcessTouch(gameTime, touch))
    return true;

  return false;
}

If any of these components (or the Game class itself) wished to perform its own touch processing, it would first call ProcessTouch in each child. Any TouchLocation object that remained unprocessed by the children could then be examined for the class's own use. In effect, this scheme allows the visually topmost children to have first access to touch input, which is often exactly what you want. It's much like the routed event handling implemented in WPF and Silverlight.

Both the RepeatButton and the Thumb are interested in touch input if a finger is first touched within the area indicated by the component's Position rectangle. The component then "captures" that finger by saving the touch ID. All other touch input with that ID belongs to that particular component until the finger is released.

Propagating Events
The touch processing of the Thumb and RepeatButton components is really the crux of the Slider control, but that's something you can study on your own if you're interested.
When the Thumb component detects finger movement on its surface, it generates DragDelta events; when the RepeatButton component detects taps or sustained presses, it fires Click events. Both these events are handled by the parent Slider component, which adjusts its Value property accordingly, and this fires a ValueChanged event from the Slider. The Slider is also responsible for setting the Position properties of the Thumb and two RepeatButton components based on the new Value.

The Game class handles the ValueChanged events from the three Slider components in a single event handler, so the method simply sets new values of the three TextBlock components and a new color:

void OnSliderValueChanged(object sender, EventArgs args)
{
  txtblkValues[0].Text = sliders[0].Value.ToString("X2");
  txtblkValues[1].Text = sliders[1].Value.ToString("X2");
  txtblkValues[2].Text = sliders[2].Value.ToString("X2");

  selectedColor = new Color(sliders[0].Value,
                            sliders[1].Value,
                            sliders[2].Value);
}

That leaves the Draw override in the Game class with just two jobs: Draw the black rectangle that serves as a background behind the components, and draw the rectangle with the new selectedColor on the right. Both jobs involve a 1 x 1 pixel bitmap stretched to fill half the screen.

Better than Silverlight?
I recently also wrote a Silverlight version of the color-scroll program for Windows Phone 7, and I discovered that the program only allowed me to manipulate one Slider at a time. However, if you deploy the XnaColorScroll program to a phone, you can manipulate all three Slider controls independently.

I find that difference very interesting. It illustrates once again how high-level interfaces such as Silverlight may make our lives much simpler, but often something else needs to be sacrificed. I believe the concept is called TANSTAAFL: There ain't no such thing as a free lunch.

XNA is so low-level in some respects that it might remind us of the Wild West. But with a little judicious use of components (including components built from other components) and mimicking routed event handling, even XNA can be tamed and made to seem more like Silverlight—perhaps even better.

CHARLES PETZOLD is a longtime contributing editor to MSDN Magazine. His new book, "Programming Windows Phone 7" (Microsoft Press, 2010), is available as a free download at bit.ly/cpebookpdf.

THANKS to the following technical expert for reviewing this article: Shawn Hargreaves

public class RepeatButton : GameComponent
{
  static readonly TimeSpan REPEAT_DELAY = TimeSpan.FromMilliseconds(500);
  static readonly TimeSpan REPEAT_TIME = TimeSpan.FromMilliseconds(100);

  int? touchId;
  bool delayExceeded;
  TimeSpan clickTime;

  public event EventHandler Click;

  public RepeatButton(Game game) : base(game)
  {
  }

  public Rectangle Position { set; get; }
  ...
}

Figure 4 Most of the RepeatButton Class

DON'T GET ME STARTED  DAVID PLATT

Turn! Turn! Turn!

I was listening to Pandora.com the other day, and its algorithms decided to play me the Byrds classic (1965) folk-rock song "Turn! Turn! Turn!" I know you'll recognize some of its lyrics: "To everything, there is a season, and a time for every purpose under heaven. A time to be born, a time to die; a time to plant, a time to reap; a time to kill, a time to heal; a time to laugh, a time to weep."

The words, of course, come from the biblical book of Ecclesiastes, specifically the third chapter. Their authorship is popularly attributed to Shlomo ben David, Solomon the Wise (although see the debate at bit.ly/g8Q5I6). If true, an ancient Israelite king would own credit for the lyrics to a Billboard No. 1 hit (though I suspect the copyright has expired, so he wouldn't get much money for it). Pete Seeger put the words to music in 1959, rearranging some to make the song more singable ("poetic license," another triumph of usability over strict veracity) and adding six words of his own.

Geek that I am, I couldn't help but realize the pattern that unites the verses of this song. They all describe opposite actions and state that a time exists for each. To put it into canonical geek language, there is a time for A and a time for !A (or ~A, if you're going to get picky).

Andy Rooney extended this pattern in his book, "The Most of Andy Rooney" (Galahad Books, 1991), when he suggested changes to the list: "There is a time for putting chocolate sauce on vanilla ice cream, and a time for eating vanilla ice cream without chocolate sauce. … A time for exercise and a time for lying down and taking a little nap. … A time to play basketball, which shall be in the wintertime, and a time to knock it off with the basketball, which shall be as the wintertime comes to an end, and not in June."

Rooney is one of my longtime influences. We both graduated from Colgate University, he in 1942 and I in 1979. I've sometimes described my goals for this column as: "Think of it as a cross between Andy Rooney, Dennis Miller and George Will. Or maybe it's better if you don't."

I've blatantly plagiarized Sol's and Pete's and Andy's idea and made the following observations about the contrasting times in a geek's life. I hope you'll use the e-mail link below to send me your own favorites. My friends, there is:

A time to slam down a huge jolt of caffeine, and a time to sip it slowly over the course of an hour.

A time to save state in object instances, and a time to make them stateless.

A time for bullet-proofing your code and a time for taking shortcuts.

A time to listen to Clippy ("I see you're writing a crude forgery") and a time to shoot him in the head.

A time to play Solitaire at meetings, and a time to pay attention to the speaker (rare, unless that speaker is me).

A time for bullet points and a time to ditch PowerPoint and actually talk to your audience.

A time to chase the cat off your laptop keyboard, and a time to use your desktop and let her sleep.

A time to hold up delivery until you can fix a particular feature, a time to chop out the entire feature because you can't fix it and need to make the ship date, and a time to just ship the damn thing.

A time to put your user first and not stroke your own ego. There isn't an opposite to this one.

A time to work until two in the morning, and a time to go home and appreciate still having a family.

A time to read brilliant columns, and a time to quit goofing off and get back to work already. That's now.

DAVID S. PLATT teaches Programming .NET at Harvard University Extension School and at companies all over the world. He's the author of 11 programming books, including "Why Software Sucks" (Addison-Wesley Professional, 2006) and "Introducing Microsoft .NET" (Microsoft Press, 2002). Microsoft named him a Software Legend in 2002. He wonders whether he should tape down two of his daughter's fingers so she learns how to count in octal. You can contact him at rollthunder.com.
EDITOR'S NOTE: KEITH WARD

Change the World

The Internet, of course, has no end. But it did have a beginning, and that beginning was when Tim Berners-Lee detailed his plan for a "Hypertext Project" (w3.org/Proposal.html), Nov. 12, 1990. In a way, it's like reading the U.S. Constitution—this document established a framework for a new way of doing things. For geeks like me, reading it can raise goosebumps.

Back in 1989-1990, Berners-Lee was looking for a way to more easily share information around the massive physics lab at which he worked. Being, among other things, a first-class software engineer, he had a specific problem in mind that he wanted to solve: "There is a potential large benefit from the integration of a variety of systems in a way which allows a user to follow links pointing from one piece of information to another one." Systems integration never sounded so simple, did it?

The idea was actually pretty basic, but of course what came out of it is not. How can you not get excited while reading this description? "HyperText is a way to link and access information of various kinds as a web of nodes in which the user can browse at will. Potentially, HyperText provides a single user-interface to many large classes of stored information such as reports, notes, data-bases, computer documentation and on-line systems help."

Here's a pretty nice summation of their plan, in a nutshell, and why it's so cool: "The texts are linked together in a way that one can go from one concept to another to find the information one wants. The network of links is called a web. The web need not be hierarchical, and therefore it is not necessary to 'climb up a tree' all the way again before you can go down to a different but related subject." This forming of a web of information nodes rather than a hierarchical tree or an ordered list is the basic concept behind HyperText.

The key component to tie the information together is described here: "A program which provides access to the hypertext world we call a browser. When starting a hypertext browser on your workstation, you will first be presented with a hypertext page which is personal to you: your personal notes, if you like. A hypertext page has pieces of text which refer to other texts. Such references are highlighted and can be selected with a mouse (on dumb terminals, they would appear in a numbered list and selection would be done by entering a number). When you select a reference, the browser presents you with the text which is referenced: you have made the browser follow a hypertext link. That text itself has links to other texts and so on."

And the rest is history. The point of our little trip in Doc Brown's DeLorean back to the Web's beginning: You never know where an idea will lead. It's now 2011, and the year ahead looms. Will you be content with doing the same old thing you've been doing for weeks, months or years? Or will you invent something? You can't invent the Web, but you can invent a way to do it better. Can you make the world better with your software, even if it's only a tiny little corner of it? In other words—are you satisfied, or are you wanting to do more than you've done? If you don't aspire to make a bigger impact, if you don't ask "What if...?" enough, if you don't break through boundaries, you surely won't. Stay thirsty, my friends.

Questions, comments or suggestions for MSDN Magazine? Send them to the editor: mmeditor@microsoft.com.

TOOLBOX: TERRENCE DORSEY

Visual Studio Tools and Extensions

Because you're reading this magazine, there's a good chance you sling code for a living. And if you sling code for a living, you probably spend a lot of time in your IDE, which is—because you're reading this magazine—probably Visual Studio. Visual Studio 2010 is already an incredibly versatile coding tool. It does pretty much everything except write the code for you, and in many cases it's getting good at doing that, too. Still, Visual Studio can't do it all out of the box. That's where extensions come to the rescue. Visual Studio 2010 provides robust support for extensibility via custom tools, templates and plug-ins. There are thousands of extensions out there, and if you can't find the feature you need in Visual Studio, chances are there's an extension that helps you customize the IDE or provides the tools you need to write code better and faster. (Note, however, that the Express versions of Visual Studio don't support extensions.) We'll cover a few of the most popular free extensions for Visual Studio 2010.

Power Tools for Visual Studio

It just so happens that one of the most robust extensions was created by the Visual Studio team. Visual Studio 2010 Productivity Power Tools (bit.ly/g4fUGG) is a package of 15 handy features that range from Solution Navigator (think Solution Explorer on steroids) to tab autocompletion and highly configurable enhancements to tabs. Scott Guthrie explains how each of the features in Productivity Power Tools works on his blog, so check that out for details (bit.ly/aopeNt).

PowerCommands 10.0 (bit.ly/hUY9tT) is a grab bag of useful extra tools that will speed up or simplify common tasks in the IDE. You get 25 features in the package; for instance, they include robust copy and paste enhancements (copying entire classes, for example). The package also includes the ability to format your code, sort using statements and remove unused using references when saving.

Don't feel left out if you're using Visual Studio Team Foundation Server (TFS): Microsoft has a set of Power Tools for you, too. Team Foundation Server Power Tools September 2010 (bit.ly/hyUNqo) gives you 11 new features that include check-in policies, item templates, a process editor, automated database backup, team member management, command-line tools and Windows PowerShell cmdlets, and Windows shell integration. You can run most features from the command line, and more information, along with lots of other great tips, can be found on the Visual Studio Team blog (blogs.msdn.com/b/visualstudio/).

Visual Studio Color Theme Editor

It may not sound that glamorous, but sometimes it's the little details that make coding that much easier. Take the colors used in the Visual Studio windows, tabs and menus. Do brighter colors cheer your mood? Are you particularly fond of magenta? Whatever you prefer, the Visual Studio Color Theme Editor (bit.ly/fPKKEV) lets you customize all of the environment colors used in the IDE. You can also save themes and share them with your friends.
StudioStyles

An even more personal choice is the colorization used for the code itself in your editor. StudioStyles (studiostyl.es) is a Web site that lets you download, create and share the .vssettings files that specify code colorization. Added bonus: These themes can be used with Visual Studio 2010, 2008, 2005 and even the Express versions.
Spell Checker

If y0u tpye lke I do, the Spell Checker is a lifesaver. The Spell Checker extension (bit.ly/aMrXoM) looks for errors in the non-code portions of your files. It works in any plain-text files, for comments and strings in source code, and for non-tag elements of HTML and ASP files.

WordLight

Do you ever want to quickly find all the places you've used a method or variable name? WordLight (code.google.com/p/wordlight) is a simple extension for Visual Studio 2008 that lets you select some text and instantly highlights all other occurrences of that string in the code file. It also works in the Output, Command and Immediate windows.

Emacs and Vim Emulation

In the beginning there was vi, and it was difficult to learn. Since those early days, Emacs and Vim have battled for supremacy as the One True Editor among coders. If you've chosen sides in that debate, yet find yourself using Visual Studio, then rejoice! The keybindings and many other features you know and love from Emacs and Vim are now available in extensions for Visual Studio. More information about Emacs emulation is available online (bit.ly/aT1bDe), and you can follow the progress of VsVim (bit.ly/e3GsMf) developer Jared Parsons via his blog (blogs.msdn.com/b/jaredpar/).

NuGet

Inspired by RubyGems and similar package-management systems from the Linux development world, NuGet (nuget.org) gives Microsoft .NET Framework developers the ability to easily incorporate libraries from source-code repositories directly into their local development projects. NuGet integrates with the Visual Studio 2010 IDE, and you can also run NuGet from the command line or via Windows PowerShell cmdlets.

TortoiseSVN Add-in for Visual Studio

So you've written and tested your code. If you're working on a team or open source project, you probably need to commit your source to a repository. If you're using Apache Subversion (subversion.apache.org) source control along with a TortoiseSVN client for Windows, there are a couple of Visual Studio extensions that incorporate the TortoiseSVN functionality into the IDE (tsvnaddin.codeplex.com and vstortoise.codeplex.com), saving you many steps in the commit process. When using TFS, you'll need to add a layer such as SvnBridge (svnbridge.codeplex.com) that translates APIs between Subversion clients like TortoiseSVN and TFS. Another popular source-code management system is Git (git-scm.com), and if that's your preferred repository, Git Extensions (code.google.com/p/gitextensions) includes shell extensions for Windows Explorer and a Visual Studio plug-in.

A Gallery of Extensions

This is just the tip of the iceberg as far as Visual Studio extensions are concerned. Thousands of templates, custom controls and extensions are available through the Visual Studio Gallery (visualstudiogallery.msdn.microsoft.com), and more are being added all the time. Many are free, and there are trial versions available for many of the commercial products.

Write Your Own Extensions

Don't see what you need in the Visual Studio Gallery? Write your own! Visual Studio 2010 includes deep hooks for extensibility—anything from a custom project template to third-party tools that integrate directly with the IDE. Through the Extending Visual Studio developer center (msdn.microsoft.com/vstudio/vextend), you'll find a vast amount of information to start creating custom Visual Studio extensions, plus MSDN Library articles and other resources in the Visual Studio community (bit.ly/eXhaIK). You've already got the tools, so start coding!

Terrence Dorsey is the technical editor of MSDN Magazine. You can read his blog at terrencedorsey.com or follow him on Twitter at @tpdorsey.

CUTTING EDGE: DINO ESPOSITO

Interceptors in Unity

Have you ever wanted to extend the behavior of an existing piece of code without having to touch the source code in any way? Have you just wanted to be able to run some additional code around the existing code? Aspect-oriented programming (AOP) was created to provide an approach that isolates core code from other concerns that crosscut the core business logic. In AOP jargon, a pointcut represents a collection of places in the body of the target class where the framework will inject extra behaviors on demand.

In last month's column, I presented a concrete example of interception that's probably very close to the needs of many developers today: I demonstrated how to add pre- and post-processing code around an existing method without touching it by leveraging the interception API in Unity 2.0, and I briefly introduced the interception mechanism used in the Unity 2.0 dependency injection container. I mostly focused on the configuration settings to enable interception and disregarded the fluent API that allows you to configure Unity in code. This month I'll step back and discuss in more detail the choices and options that you typically encounter along the way, and in the rest of this article I'll explore the fluent API and alternative ways of defining Unity interceptors.

To fully comprehend the purpose of this follow-up article, I'll begin by summing up what I discussed last month. That quick demo made a few assumptions and used some default components. First, it worked on a type registered with the Unity Inversion of Control (IoC) infrastructure and instantiated via the Unity factory layer. Second, the pointcut was only defined through an interface. An interface-based pointcut means that only the members of the interface will be extended at runtime via code injection.

AOP in .NET and Unity

Imagine you have a deployed application where, at some point, you perform some business-specific action. One day, your customer asks to extend that behavior to perform some more work. You grab the source code, modify it, and bill a few hours of consulting time for coding up and testing the new features. But what if you're not billing by the hour and instead work full time for the company? The more requests for change you get, the more hours you spend, and you risk creating new (and not necessarily required) branches of your codebase. Now imagine a slightly different scenario: someone reports a bug or a serious performance hit. You need to investigate and fix the problem, and you want it to be absolutely unobtrusive.

In all of these scenarios you'd heartily welcome any solution that would let you add new behaviors seamlessly and with no exposure to the source code. AOP helps you in all of these scenarios. When AOP meets IoC, the result is a seamless, easy and effective coding experience. Most IoC frameworks offer AOP capabilities; Spring.NET and Unity 2.0 are two examples. Unity 2.0 offers a Microsoft .NET Framework 4-based framework for achieving this in a surprisingly quick and easy way.

Interceptable Instances

AOP is not pure magic, and you'll never be able to hook up a plain CLR class instantiated via the standard new operator. To add a new behavior to an existing or freshly created instance of a class, you must take some control over the factory: you can resort to explicit calls that return a proxy for the original object, or keep the whole thing running behind an IoC framework. Let's start with an example where no IoC capabilities are used. Here's some basic code that makes an existing instance of the Calculator class interceptable:
var calculator = new Calculator();
var calculatorProxy = Intercept.ThroughProxy<ICalculator>(calculator,
  new InterfaceInterceptor(),
  new[] { new LogBehavior() });
Console.WriteLine(calculatorProxy.Sum(2, 2));

I'm assuming that the Calculator class implements the ICalculator interface. You end up working with an interceptable proxy that wraps your original object. To be interceptable, a class must either implement an interface or inherit from MarshalByRefObject. If the class derives from MarshalByRefObject, then the interceptor must be of type TransparentProxyInterceptor:

var calculator = new Calculator();
var calculatorProxy = Intercept.ThroughProxy(calculator,
  new TransparentProxyInterceptor(),
  new[] { new LogBehavior() });
Console.WriteLine(calculatorProxy.Sum(2, 2));
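Both snippets plug in a LogBehavior object that isn't shown here. As a point of reference only, the following is a minimal sketch of what such a behavior could look like, written against the same IInterceptionBehavior interface used by the BinaryBehavior sample later in this column; the console tracing is my own choice rather than the author's implementation:

using System;
using System.Collections.Generic;
using Microsoft.Practices.Unity.InterceptionExtension;

public class LogBehavior : IInterceptionBehavior
{
  public IEnumerable<Type> GetRequiredInterfaces()
  {
    // No extra interfaces are added to the proxy.
    return Type.EmptyTypes;
  }

  public bool WillExecute
  {
    get { return true; }
  }

  public IMethodReturn Invoke(IMethodInvocation input,
    GetNextInterceptionBehaviorDelegate getNext)
  {
    // Pre-processing: trace the call before it reaches the target.
    Console.WriteLine("Entering {0}", input.MethodBase.Name);

    // Let the next behavior (and eventually the target method) run.
    var methodReturn = getNext().Invoke(input, getNext);

    // Post-processing: trace the outcome.
    if (methodReturn.Exception != null)
      Console.WriteLine("{0} threw {1}", input.MethodBase.Name,
        methodReturn.Exception.GetType().Name);
    else
      Console.WriteLine("Leaving {0}, returned {1}",
        input.MethodBase.Name, methodReturn.ReturnValue);

    return methodReturn;
  }
}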
So how many types of interceptors exist in Unity?

Instance and Type Interceptors

An interceptor is the Unity component responsible for capturing the original call to the target object and routing it through a pipeline of behaviors so that each behavior has its chance to run before or after the regular method call. Interception can be of two types: instance interception and type interception. Instance interceptors create a proxy to filter incoming calls directed at the intercepted instance. Type interceptors generate a new class derived from the type being intercepted, and work on an instance of that derived type.

InterfaceInterceptor and TransparentProxyInterceptor are two Unity interceptors that belong to the instance interceptor category. InterfaceInterceptor can intercept public instance methods on just one interface on the target object, and it can be applied to new and existing instances. TransparentProxyInterceptor can intercept public instance methods on more than one interface, as well as marshal-by-reference objects; it can also be applied to new and existing instances. This is the slowest approach for interception, but it can intercept the widest set of methods. VirtualMethodInterceptor belongs to the type interceptor category. It can intercept virtual methods, both public and protected, but it can only be applied to new instances.

With instance interception, the application code first creates the target object using a classic factory (or the new operator), and then is forced to interact with it through the proxy provided by Unity. The implementation of instance interception is such that the constructor has already executed by the time the application code gets back an object to work with, so any interceptable action necessarily follows the creation of the instance. It should be noted that instance interception can be applied to any public instance methods, but not to constructors. This is fairly obvious for the scenario in which interception applies to an existing instance. It's a bit less obvious when interception is applied to a newly created instance. The Intercept class also offers a NewInstance method you can call to create an interceptable object in a more direct way. Here's how to use it:

var calculatorProxy = Intercept.NewInstance<Calculator>(
  new VirtualMethodInterceptor(),
  new[] { new LogBehavior() });

Note that when you use NewInstance, the interceptor component has to be slightly different—neither InterfaceInterceptor nor TransparentProxyInterceptor, but rather a VirtualMethodInterceptor object.

In the case of type interception, Unity uses dynamic code generation to return an object that inherits from the original type; any public and protected virtual methods are overridden to support interception. (You can't create the object directly with the new operator and get type interception.) The Calculator class looks like this:

public class Calculator {
  public virtual Int32 Sum(Int32 x, Int32 y) {
    return x + y;
  }
}

The target object, however, is not of the original type. The actual type is derived by Unity on the fly and incorporates the interception logic (see Figure 1). Figure 2 shows the actual name of the type that results from a dynamic inspection of the calculatorProxy variable.

Figure 1 Instance Interceptor and Type Interceptor

Figure 2 Actual Type After Type Interception

If a method is calling another method on the same object when using type interception, then that self-call can be intercepted, because the interception logic is in the same object. However, with instance interception, the interception only happens if the call goes through the proxy. Self-calls, of course, don't go through the proxy and therefore no interception happens.
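If you want to verify the difference between the two interception models on your own machine, a quick inspection sketch like the following (my own addition, assuming the Calculator, ICalculator and LogBehavior types used throughout this column, plus the usual using directives) prints the runtime type of each kind of proxy:

// Inspection sketch: contrast instance interception and type interception.
var plain = new Calculator();
var instanceProxy = Intercept.ThroughProxy<ICalculator>(plain,
  new InterfaceInterceptor(), new[] { new LogBehavior() });
var typeProxy = Intercept.NewInstance<Calculator>(
  new VirtualMethodInterceptor(), new[] { new LogBehavior() });

// The instance proxy is a separate wrapper object; the type-intercepted
// object is an instance of a dynamically generated subclass of Calculator.
Console.WriteLine(instanceProxy.GetType().FullName);
Console.WriteLine(typeProxy.GetType().FullName);
Console.WriteLine(typeProxy.GetType().BaseType);   // Calculator

The instance proxy reports a wrapper type, while the type-intercepted object reports a dynamically generated class whose BaseType is Calculator, which is the kind of name Figure 2 shows.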
It’s also worth noting that there are some other significant differences between instance and type interception—for example. The Calculator class looks like this: public class Calculator { public virtual Int32 Sum(Int32 x. // Start the application var calculator = container. var result = calculator. and then is forced to interact with it through the proxy provided by Unity. var calculatorProxy = Intercept.Figure 2 Actual Type After Type Interception var calculator = new Calculator(). This won’t happen as long as you stick to the Intercept class that behaves like a smart factory and needs to be prepared every time you use it.) The target object. The interceptor can only be applied to new instances. This is the slowest approach for interception. Furthermore—and as I see things—the level of code flexibility grows beyond imagination if you also combine IoC containers with offline configuration. VirtualMethodInterceptor belongs to the type interceptor category.Sum(2. The interceptor can be applied to new and existing instances. Interception can be of two types: instance interception and type interception. TransparentProxyInterceptor can intercept public instance methods on more than one interface. Here’s how to use it: var calculatorProxy = Intercept. Console. but not to constructors. new TransparentProxyInterceptor(). the application creates the target object through the API or Unity.Initialize().NewInstance<Calculator>( new VirtualMethodInterceptor(). however. new[] { new LogBehavior() }). 12 msdn magazine Figure 2 shows the actual name of the type that results from a dynamic inspection of the calculatorProxy variable. A call to the Resolve method shields you from all the details of interception. . Here I registered two behaviors for the ICalculator type. result. by all means get it via an IoC container. Unity’s defaults will automatically work for them.com/practices/2010/unity"> <assembly name="SimplestWithConfigIoC"/> <namespace name="SimplestWithConfigIoC. Figure 4 Adding Interception Details Via Configuration <unity xmlns="http://schemas. 2010) and coauthored “Microsoft . A behavior is made of three methods.config file (or web. container.NET MVC” (Microsoft Press. behaviors are objects that actually implement the crosscutting concerns. the behavior won’t execute. getNext).and post-processed by LogBehavior and BinaryBehavior. because LogBehavior and BinaryBehavior are concrete types.ToBinaryString().Invoke(input.MethodBase. // Register ICalculator with the container and map it to // an actual type. 2008). Hence.Interception. } public bool WillExecute { get { return true.EmptyTypes.. In this case. // Transform var binaryString = ((Int32)result). so your behavior will in fact execute even if you returned false.RegisterType<ICalculator. The parameter getNext is the delegate for you to move to the next behavior in the pipeline and eventually execute the method on the target. Esposito is a frequent speaker at industry events worldwide. GetNextInterceptionBehaviorDelegate getNext) { // Perform the operation var methodReturn = getNext().Configuration" /> <container> <extension type="Interception" /> <register type="ICalculator" mapTo="Calculator"> <interceptor type="InterfaceInterceptor"/> <interceptionBehavior type="LogBehavior"/> <interceptionBehavior type="BinaryBehavior"/> </register> <register type="LogBehavior"> </register> <register type="BinaryBehavior"> </register> </container> </unity> Figure 4 shows the script you need to have in the configuration file. 
} } Next month I’ll resume from here to discuss more effective ways to apply interception and define matching rules for intercepted methods.Unity.WriteLine("Rendering {0} as binary = {1}".Practices.asp.Practices.Name == "Sum") { .LoadConfiguration(). all you can do is run through a bunch of IF statements to figure out which method is actually being invoked. if all the behaviors registered for the type have WillExecute set to false. This means that any calls to public members of the interface will be pre.microsoft. // Enable interception in the current container container. return methodReturn. This is actually a bit subtler. In addition.Calc"/> <namespace name="SimplestWithConfigIoC. What if you want to use more specific matching rules? With plain interception as I described in this article. Invoke will always be called. like so: if(input.AddNewExtension<Interception>(). A class that implements the IInterceptionBehavior interface. You can join his blog at weblogs. Calculator>( new Interceptor<VirtualMethodInterceptor>(). Note that. } } public IMethodReturn Invoke( IMethodInvocation input. Microsoft.config if it’s a Web application). The same solution can be implemented in an even more flexible way by moving the configuration details off to the app.. then the proxy itself won’t be created and you’ll be working with the raw object again. a behavior rewrites the execution cycle of the intercepted method and can modify method parameters or return values. you actually don’t need to register them at all.Unity. Figure 5 shows a sample behavior that intercepts the method Sum and rewrites its return value as a binary string. D INO E SPOSITO is the author of “Programming Microsoft ASP. InterceptionConfigurationExtension. when the proxy or derived type is being created. THANKS to the following technical expert for reviewing this article: Chris Tavares Cutting Edge 14 msdn magazine . write it out Console. If it returns false. } } in your applications. } Figure 5 A Sample Behavior public class BinaryBehavior : IInterceptionBehavior { public IEnumerable<Type> GetRequiredInterfaces() { return Type. The parameter input gains you access to the method being called on the target object. However. The method WillExecute is simply a way to optimize the proxy. It’s really about optimizing proxy creation.Configuration. new InterceptionBehavior<LogBehavior>()). InterceptionExtension.Figure 3 Bootstrapping Unity public class UnityStarter { public static UnityContainer Initialize() { var container = new UnityContainer(). Behaviors can even stop the method from being called at all or call it multiple times.ReturnValue. Behaviors In Unity. // For example. return container. Based in Italy. Note that all intercepted methods on the target object will execute according to the logic expressed in Invoke. // Grab the output var result = methodReturn. container. the bootstrap code consists of the following two lines: var container = new UnityContainer(). binaryString). interfaces returned from this method will be added to the proxy.net/despos. The GetRequiredInterfaces method allows the behavior to add new interfaces to the target object.Behaviors"/> <sectionExtension type="Microsoft. specify interception details. The Invoke method determines the actual logic used to execute a call to a public method on the target object. the core of a behavior is the Invoke method.NET: Architecting Applications for the Enterprise” (Microsoft Press. . 2. the data flow can be pretty easy—taking the form of pushing catalog-type data (menu. 
I’m taking a look at the challenge again. and I’ve found that it has a similar problem of synchronizing data between nodes. shared terminal. inclusion of various devices for collecting and displaying data. Much like retail chains. what have changed are the additional requirements that get added to the mix as technology becomes more advanced. warehouse and so on) down and t-logs back up—to quite complex. real-time loss-prevention analysis. Bidirectional data synchronization between corporate and nodes (for example. 4. 3. but in today’s world that solution would only represent the substrate of a more robust solution. From a latent satellite connection on an offshore oil platform to an engineer in an oil field. Defining the Solution Architecture Generally. solving the data flow problems from a decade ago would be simple. evaluation and adjustment of equipment in use. I was heavily involved in the retail industry. and the only two criteria that would force it are the need for real-time updates between nodes or the ability to sync branches if the corporate data store isn’t accessible. real-time information is needed between nodes. but this time with a little help from SQL Azure and the Sync Framework. Type 4 is by far the most complex problem and typically leads to many conflicts. Therefore. with both retail and O&G in mind. Rather. Two one-way pushes on different data. and closer-to-real-time feeds. rig. the branches (for example. I’ll still give a few code examples for setting up synchronization between nodes and SQL Azure and filtering content as a way to reduce traffic and time required for synchronization. the requirement for timely. catalog) and one from branch to corporate (for example. In most cases. While the core problem hasn’t changed. we start adding things we’d like to have. For the most part. For retail. I’ll focus a little more on the general architecture and a little less on the implementation. specific equipment and more). one from corporate to branch (for example.FORECAST: CLOUDY JOSEPH FULTZ Branch-Node Synchronization with SQL Azure In my years prior to joining Microsoft and for the first few years thereafter. manual product entry or employee information). I’ll discuss how the cloud will help solve the problem of moving data between the datacenter (corporate). but this isn’t generally the case for data synchronization. I try to avoid this pattern. Peer synchronization between branches and between branches and corporate. such as increasing the data volume to get more detail. In some cases. During this time. but they have some added complexities related to the operation. I found it almost humorous to see how many times the branchnode synchronization problem gets resolved as technologies advance. Because near-real-time or real-time updates among too many nodes would generally create too much traffic and isn’t typically a reasonable solution. accurate data doesn’t change. The distribution to workstations. Let’s face it. but this isn’t generally the case for data synchronization. O&G companies have the same patterns. whereas the . and manual product entries from the branch to corporate and between branches. and a different tack is taken to address the need. In next month’s column. real-time information is needed between nodes. the more we want. O&G companies have a wide range of devices and connectivity challenges. I’ve had a fair amount of exposure to the oil and gas (O&G) industry. So. the only criterion to which I really pay attention is the ability to sync without the master. 
In some cases. the most prevalent pattern that I see is to push data directly from the corporate master database (via a distributor of some type) down to servers at the branches and to mobile users. Instead of solving the simple problem of moving data between nodes. by frequently updating inventory levels. In my current role. This month. t-logs and inventory). 16 msdn magazine I think about synchronization in the following ways to get a rough idea of the level of complexity (each one is subsequently more complex to implement and support): 1. store. this includes branch to branch—the focus is on the fact that it’s basically two or more one-way syncs. hub and so on) and to individual devices (handheld. I’ll examine using a service-based synchronization approach to provide a scalable synchronization solution beyond splitting data across SQL Azure databases based on content or geographic distribution. it’s an event notification scenario. the more we have. point-of-sale (POS) terminals and other such devices is typically done from the server at the branch location (commonly called “back-of-house servers”). Push read-only data from corporate to branch and onward. . . while in parallel. This gives Workstations and POS the benefit of initiating synchronization from the node. Knowing the endpoints. Consider that in the first scenario. the API semantics are really one pair of others build processes to handle the distribution and collection of endpoints and one scope at a time. as mentioned earlier. laptops) is done side. As a point of design. This could easily lead to lost data due to losing the connection strings are defined for a sync agent to run. by potentially having to debug the sync process or agent at each 3. If the application is designed to sync from the master or distributor to the nodes. I tend to start with a mixed model of synchronization by initiating from the master data store to SQL Mobile Users Azure and subsequently initiating from the nodes to sync between Figure 2 Base Architecture Using SQL Azure and the Sync Framework msdnmagazine. in place of replication. I simply add a layer for example) the complexity is diminished for the synchronization code. Additionally. High availability of the data without additional cost on the target device and could complicate support and maintenance and effort. they’ll be working off of old data until corporate Workstations and POS is back up. chronization process (see Figure 1). 3. it will put a bit more complexity into the applications 2. Complexity in the synchronization process in order Some organizations do this via the built-in replication features to make the synchronization of multiple nodes happen of the relational database management system (RDBMS). master data store. on the downsynchronization to mobile users (for example. although the Agent process would provide a means for a given application to initiate synchronization of its data. because it isn’t the device. Distributor Also. which supports SQL Azure. if the corporate connection a separate process should be created that manages the synchronior data store is down. This sync agent would exist externally to the applications that are the data conSync sumers on the devices. if the branches have common data (for example. I’ll maintain the pattern. datacenter footprint. by segmenting the flow of data through multiple SQL Azure databases. Potentially less security sensitivity. it puts some technical strains on the implementation. if data must be synchronized for different applications. 
data. Thus.com January 2011 17 . I enjoy the advantage of Figure 1 Typical Architecture for Data Distribution fewer points of management and support. • Can focus on the scope(s) for the application/device. all of the clients will have to hold on to their zation based on a configuration file where scope. Knowing the appropriate scopes for each target. SQL Azure addresses this scenario by automatically making Corp Data Back-of-House Servers copies of the data and providing automatic failover. I must consider the impact of initiating synchronization from the branches or from Mobile Users the server. but use an instance of SQL Azure By initiating synchronization from the target (node or branch.device while waiting for the connection or due to simply running out of space on the device to store the transactions. Using the Sync Framework. requiring: 1. as it: between the distributor and the branches (see Figure 2). However. in place of the distributor. frequency and transactions. I can reduce the risk of being down and reduce load exposure further by using not only a separate instance but also different datacenters. Ideally. directly from corporate to the machine via a client-initiated syn2. I’ll use the Sync Framework. because it’s rolled up into a single process. but it also Corp Data Back-of-House/ SQL Azure reduces the support and mainteDistributor Branch Servers nance aspect. warehouse inventory data). What do I get by inserting SQL Azure? Some of the benefits in • More easily handles being occasionally connected. a branch-node scenario are: • Only has to know and manage a few endpoints by which 1. Scaling the data service without having to grow the the distribution data is synchronized. While there’s no perfect solution. . . What’s more. From any browser.Tables. Keeping Data in Sync SQL Azure Data Sync provides a central cloud-based management system for all synchronization relationships. were quite difficult to build. I look for scenarios that invalidate and force a modification to the design. one might say that data is pushed from master and pulled from the branches. or VPNs. SqlConnection onPremiseConn = new SqlConnection(LocalConnectionString). I think about the costs and benefits of moving the synchronization process control from one point in the chain to another (for example.Add("contact"). Similarly. columnsToInclude. With SQL Azure Data Sync.Add("address"). I like to start with the model described earlier and either synchronize multiple one-way syncs to accomplish something similar to bidirectional data sync. this is easy because administrators create “Sync Groups” that define data to be shared among databases. For more information and to register for participation in the CTP 2.Add("city"). Solutions leveraging SQL Azure Data Sync will allow users to continue to access local data and have changes seamlessly synchronized to SQL Azure as they occur. Senior Program Manager. administrators can connect to the public service and manage and monitor the various database endpoints. allowing for phased migrations of applications to the cloud. any changes made by applications to SQL Azure are synchronized back to the on-premises SQL Server. synchronization between SQL Server and SQL Azure databases provides for a wealth of new scenarios that. This update provides organizations the ability to easily extend on-premises SQL Server databases to the cloud. columnsToInclude.Add(authorsDescription). 
At the Microsoft Professional Developers Conference 2010.Figure 3 Creating a Synchronization Scope SqlConnection azureConn = new SqlConnection(AzureConnectionString).Add("au_fname"). we also introduced a new component called the SQL Azure Data Sync Agent. SQL Azure Data Sync provides a scheduling service that allows synchronization to take place as often as every five minutes. The Agent’s job is to monitor and log tasks as well as to initiate synchronization requests from the SQL Azure Data Sync. columnsToInclude. While there isn’t a one-for-all design. // Add the authors table to the sync scope authorsScopeDesc. columnsToInclude. all of the remote or regional SQL Server databases can synchronize data changes. enabling them to bring data closer to the users.Add("phone"). columnsToInclude. or use bidirectional synchronization between the device/corporate database and SQL Azure. SQL Azure Data Sync SQL Azure Data Sync is a cloud-based service hosted in Windows Azure that enables synchronization of entire databases or specific tables between SQL Server and SQL Azure. SQL Azure Data Sync Forecast: Cloudy the node and SQL Azure. New Scenarios With SQL Azure Data Sync. onPremiseConn).Add("state"). —Liam Cavanagh. // List of columns to include Collection<string> columnsToInclude = new Collection<string>(). please visit microsoft. In addition. Restated. meaning there are no requirements from a firewall or securityconfiguration standpoint—which makes setup a snap.Add("au_id").GetDescriptionForTable("authors". Based on the needs and constraints of the solution being considered. This agent is a Windows Service that’s installed on-premises and links the local SQL Server databases to SQL Azure Data Sync through a secure outbound HTTPS connection.Add("zip"). or less frequently if the preference is to run synchronization at off-peak times.Add("au_lname"). In the recent SQL Azure Data Sync CTP 2 update. After which. Setting up Synchronization There are two methods to set up synchronization using Sync Framework 2. In addition. columnsToInclude. while also greatly reducing bandwidth and requirements for Virtual Private Networks. These Sync Groups could contain a corporate SQL Server that synchronizes data to a centralized SQL Azure “Data Hub. with SQL Azure becoming the highly available central hub between the master and branches. the only synchronization style I attempt to avoid is peer-to-peer. device to cloud or corporate to cloud).aspx. each of these questions has multiple layers that need to be considered and against which possible solution designs must be vetted.1: sync client in the cloud and sync client on the local 20 msdn magazine . Imagine you wanted to share data with your branch offices or retail store databases. columnsToInclude. Just a few of these considerations are questions such as: • Is there a place to host the process for master? • Are there security policies that conflict with hosting the sync process in SQL Azure? • How many nodes at each level are synchronizing? • Can the target device realistically support a sync process? • What’s the requirement with regard to timeliness of data sync? SQL Azure becomes the highly available central hub between the master and branches.” Then. Generally.com/en-us/sqlazure/datasync. columnsToInclude. we announced an update to this service called SQL Azure Data Sync Community Technology Preview (CTP) 2. in the past. columnsToInclude. Why not synchronize some of that data to your SQL Azure databases around the world when needed? 
Then users could access the data closest to them while reducing scalability requirements on your local SQL Server. // Create a scope and add tables to it DbSyncScopeDescription authorScopeDesc = new DbSyncScopeDescription(ScopeName). // Definition for authors from local DB DbSyncTableDescription authorsDescription = SqlSyncDescriptionBuilder. from this Data Hub. Imagine you have quarterly reporting requirements that put a huge cyclical load on your SQL Server database. columnsToInclude. the ability to synchronize across multiple SQL Azure datacenters makes it easy to scale out loads across geographies. . I’ll be rather specific in this example. I create a connec. filters may be used. Note: If x64 is the target platform.ScopeName))) { onPremScopeConfig.ing the deployment architecture and data flow. 1. Figure 5 Synchronizing Data with Filters By segmenting the flow of data through multiple SQL Azure databases. In addition. 2. Identify the data to be synchronized and the direction of orch. are the steps to set up a synchronization relationship: SyncOrchestrator orch= new SynOrchestrator(). I can specify filters. the entire database could be provisioned. the particular columns.ly/dCt6T0. Because this is the first time that Asia Filter the scope has been provisioned in the database.LocalProvider = new SqlSyncProvider(ScopeName.Apply(). At its simplest. In the case that it’s desirable to horizontally partition or otherwise filter the data. Add necessary filters.S. That code looks something like this: SqlConnection LocalConnection = new SqlConnection(LocalConnectionString). SqlConnection AzureConnection = new SqlConnection(AzureConnectionString).ScopeExists(authorsScopeDesc.ScopeExists(authorsScopeDesc.Direction = SyncDirectionOrder. The next step is to U. 4. I can focus on optimiztion to the local database and retrieve a definition (DbSyncTable. if (!(onPremScopeConfig. For each structure that needs to be synchronized.ly/gKQODZ). Synchronizing the Data Once the databases are properly provisioned.) tion.Synchronize(). orch. This involves the use of the SyncOrchestrator object. AzureConnection). I’ll focus on the latter for the moment.1 (bit.With simple replication of data handled. SqlSyncProvider LocalProvider = new SqlSyncProvider(ScopeName. ScopeProvisioning) used to synchronize the data. it’s pretty simple to get them synchronizing. authorsScopeDesc). if (!(azureScopeConfig.ScopeName))) { azureScopeConfig. I’ll specify Azure can really be a huge benefit in branch-node architectures. here SqlSyncProvider AzureProvider = new SqlSyncProvider(ScopeName.S. this in combination with SQL to the scope (DbSyncScopeDescription). 3. } A good example of a console app to provision or sync a local and SQL Azure database can be found on the Sync Framework Team Blog at bit. 5.RemoteProvider = new SqlSyncProvder(ScopeName.Figure 4 Provisioning Scope // Create a provisioning object for "customers" and // apply it to the on-premises database SqlSyncScopeProvisioning onPremScopeConfig = new SqlSyncScopeProvisioning(onPremiseConn. 22 msdn magazine Target-Based Filter Target-Based Filter Forecast: Cloudy . or it can be limited to given columns. LocalConnection). as shown (Europe) Europe Distributor Branch Servers Filter in Figure 4. there will be some SQL Azure new tables for storing scope infor(Asia) mation and also a table specifically Mobile Users for tracking the Authors scope that was provisioned in the databases. This will be used to define the scopes (SqlSyncorch. 
I can put the data closer to simply synchronize the entire table. you must subsequently add it to the scope. AzureConnection). a bit of code will Sync need to be added to get the descripAgent SQL Azure (U. Provision the databases and tables for synchronization. LocalConnection). Limiting the sync relationship to the ultimate consumer and optimize the bandwidth usage (and specific columns is a good way to optimize bandwidth usage and hence charges) by only synchronizing the data that matters for that speed up processes (see Figure 3). orch. Using the Sync Description) for the table to be synchronized and add that table Framework. Create and run the process to synchronize. data flow. as it helps convey the message. which is the magic behind the curtains that identifies the changes and moves the changes between them. a build target for x64 will need to be added or the SyncOrchestrator won’t be able to resolve its dependencies. It requires the creation of a SqlSyncProvider for each end of the activity with the scope specified. but this isn’t necessary if the desire is to Using the combination of the two. Data and Geographic Dispersion and I’ll start with databases in place at both ends. machine. I can reduce the risk of being down. authorsScopeDesc). Filter grab a scope-provisioning object Workstations and POS and use it to provision each database if the scope doesn’t already SQL Azure Corp Data Back-of-House/ exist in that database. or only specific tables. Download and install the Sync Framework 2.Apply(): } // Provision the SQL Azure database from the on-premises SQL Server database SqlSyncScopeProvisioning azureScopeConfig = new SqlSyncScopeProvisioning(azureConn.DownloadAndUpload. Because it’s SQL Azure.FilterClause = "[authors].Tables["authors"]. If I want it to synchronize in both directions based on the filter. for end users who might travel between locations where it would be nice to be location-aware. provisioning and managing the infrastructure.AddFilterColumn("state"). Instead of using data servers in various geographical areas. one gets the performance. scalability and reliability without all of the headaches of designing. the sync agent could locate itself and reach out to sync data specifically for the current location. whether a single instance or multiple ones.com . THANKS to the following technical expert for reviewing this article: David Browne msdnmagazine.ly/dpyMP8). how.Tables["authors"]. like so: onPremScopeConfig. one can achieve fine-grained control over what.region or data segmentation. Additionally. onPremScopeConfig. Enabling filtering is no harder than provisioning a synchronization scope. A couple of examples of this are current stats or alerts for workers walking into a manufacturing/plant environment and current-day sales for regional managers of retail chains (see Figure 5). or UT. Once the databases are properly provisioned. improving the user experience as it relates to data availability and freshness. Thus. The needed change is simply to add two lines of code for each filter that’s being added: one line to add a filter column to the table and a second to add the filter clause. it will need to be added to both scopes as they’re provisioned on each end. can really enhance the data availability and overall performance when synchronizing to nodes by adding that everimportant layer of indirection. there could be multiple scopes in existence that have different filters—or have none. 
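Putting the provisioning and synchronization pieces together, here is a minimal console sketch of the flow described above. It is only a sketch: the scope name, the authors table, and the connection strings are placeholders of my own, and error handling is omitted.

using System;
using System.Data.SqlClient;
using Microsoft.Synchronization;
using Microsoft.Synchronization.Data;
using Microsoft.Synchronization.Data.SqlServer;

class SyncRunner
{
  const string ScopeName = "AuthorsScope";   // Illustrative name only.

  // Placeholder connection strings; replace with real ones.
  static string LocalConnectionString =
    "Data Source=.;Initial Catalog=Pubs;Integrated Security=True";
  static string AzureConnectionString =
    "Server=tcp:myserver.database.windows.net;Database=Pubs;User ID=user@myserver;Password=...;Encrypt=True";

  static void Main()
  {
    using (var local = new SqlConnection(LocalConnectionString))
    using (var azure = new SqlConnection(AzureConnectionString))
    {
      // Describe the scope from the on-premises schema.
      var scopeDesc = new DbSyncScopeDescription(ScopeName);
      scopeDesc.Tables.Add(
        SqlSyncDescriptionBuilder.GetDescriptionForTable("authors", local));

      Provision(local, scopeDesc);
      Provision(azure, scopeDesc);

      // Wire both ends to an orchestrator and run a two-way sync.
      var orch = new SyncOrchestrator
      {
        LocalProvider = new SqlSyncProvider(ScopeName, local),
        RemoteProvider = new SqlSyncProvider(ScopeName, azure),
        Direction = SyncDirectionOrder.DownloadAndUpload
      };

      var stats = orch.Synchronize();
      Console.WriteLine("Uploaded {0}, downloaded {1} changes",
        stats.UploadChangesTotal, stats.DownloadChangesTotal);
    }
  }

  static void Provision(SqlConnection conn, DbSyncScopeDescription scopeDesc)
  {
    // Creates the tracking tables and scope metadata only once per database.
    var config = new SqlSyncScopeProvisioning(conn, scopeDesc);
    if (!config.ScopeExists(ScopeName))
      config.Apply();
  }
}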
Thus, there could be multiple scopes in existence that have different filters, or none at all. Instead of using data servers in various geographical areas, data can simply be synchronized to an instance of SQL Azure in that geographical area, and those clients in that area can synchronize to it. By spreading the data around geographically and implementing scopes that make sense for synchronizing particular data with a particular frequency, the user experience improves as it relates to data availability and freshness.

Go Forth and Spread the Data

Adding SQL Azure to the mix can really enhance data availability and overall performance when synchronizing to nodes, by adding that ever-important layer of indirection. Look for next month's column, where I'll expand on the implementation and show how Windows Azure can be added into the mix for synchronization using the latest Sync Framework 4.0 CTP released in October (bit.ly/dCt6T0).

JOSEPH FULTZ is an architect at the Microsoft Technology Center in Dallas, where he works with both enterprise customers and ISVs designing and prototyping software solutions to meet business and market demands. He's spoken at events such as Tech-Ed and similar internal training events.

THANKS to the following technical expert for reviewing this article: David Browne

WORKFLOW SERVICES

Scalable, Long-Running Workflows with Windows Server AppFabric

Rafael Godinho

Business processes can cover a wide variety of application scenarios. They can include human workflows, business logic, coordination of presentation layers and even application integration. Although those scenarios are different, successful business processes have a few things in common. They need to be simple to build, use and modify. They need to be scalable to meet the changing needs of the business. And, often, they need some form of logging for status, compliance and debugging.

Workflows are a good example of business processes that have been codified into applications. They embody all of the elements I mentioned: human business needs, business logic exposed through services, coordination between people and applications, and the ability to easily enter data and retrieve status. That's a lot for an application to do, and a lot to code. Fortunately, the Microsoft .NET Framework and Windows Server AppFabric provide the tools you need to create, deploy and configure trackable, long-running workflow services. In this article I'll walk you through the process of building a simple scalable workflow service using WCF, WF and Windows Server AppFabric.

This article discusses:
• Creating a workflow service
• Simplifying workflow persistence
• Workflow tracking
• Scaling the workflow service

Technologies discussed: Windows Server AppFabric, Windows Communication Foundation, Windows Workflow Foundation

Code download available at: code.msdn.microsoft.com/mag201101AppFabric

Creating a Workflow Service

To create a workflow service you need to combine two technologies you're probably already familiar with from the .NET Framework: WCF and WF. This integration is seamless to the developer and is done using specific messaging activities to receive WCF messages in the workflow. The workflow is hosted in a workflow-specific WCF ServiceHost (the WorkflowServiceHost), which exposes WCF endpoints for these messages. Among the messaging activities, two can be used to receive information, allowing the workflow to accept messages from external clients as Web service calls: the Receive activity and the ReceiveAndSendReply template.
A Receive activity is used to receive information to be processed by the workflow. It can receive almost any kind of data, like built-in data types, application-defined classes or even XML-serializable types. Figure 1 shows an example of a Receive activity on the workflow designer. This type of activity has many properties, but four of them are extremely important to remember:

• Content indicates the data to be received by the service.
• OperationName specifies the service operation name implemented by this Receive activity.
• ServiceContractName is used to create service contracts grouping service operations inside the generated Web Services Description Language (WSDL).
• CanCreateInstance is used to determine if the workflow runtime must create a new workflow instance to process the incoming message, or if it will reuse an existing one using correlation techniques. You'll probably want to set it to true on the first Receive activity of the workflow.

Figure 1 A Receive Activity on the Workflow Designer

If used alone, the Receive activity implements a one-way message-exchange pattern, much like WCF. It can also be used to implement a request-response pattern by associating it with a SendReply activity. To help implement the request-response pattern, WF adds an option to the Visual Studio toolbox called ReceiveAndSendReply. When dropped on the workflow designer, it automatically creates a pair of pre-configured Receive and SendReplyToReceive activities within a Sequence activity (see Figure 2).

Figure 2 ReceiveAndSendReply on the Workflow Designer

The idea behind the ReceiveAndSendReply template is to do some processing between the Receive and SendReplyToReceive actions. However, it's important to notice that persistence is not allowed between the Receive and SendReplyToReceive pair. A no-persist zone is created and lasts until both activities have completed, meaning that if the workflow instance becomes idle, it won't persist even if the host is configured to persist workflows when they become idle. If an activity attempts to explicitly persist the workflow instance in the no-persist zone, a fatal exception is thrown, the workflow aborts and an exception is returned to the caller.
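The designer is the usual way to build this, but the same Receive/SendReply pair can also be declared in code, which makes the moving parts easier to see. A rough sketch only; the contract and operation names are made up, the business step is elided, and the ReceiveContent/SendContent factory calls are written from memory, so treat the exact shapes as assumptions:

using System;
using System.Activities;
using System.Activities.Statements;
using System.ServiceModel.Activities;

static class ExpenseWorkflowFactory
{
  public static Activity BuildSubmitExpense()
  {
    Variable<decimal> amount = new Variable<decimal>("amount");
    Variable<int> reportId = new Variable<int>("reportId", 42);  // placeholder result

    Receive submit = new Receive
    {
      OperationName = "SubmitExpenseReport",
      ServiceContractName = "IExpenseService",  // XName; a plain string converts implicitly
      CanCreateInstance = true,                 // the first Receive creates the instance
      Content = ReceiveContent.Create(new OutArgument<decimal>(amount))
    };

    return new Sequence
    {
      Variables = { amount, reportId },
      Activities =
      {
        submit,
        // ... the activity that creates the expense report and sets reportId goes here ...
        new SendReply
        {
          Request = submit,                     // pairs the reply with its Receive
          Content = SendContent.Create(new InArgument<int>(reportId))
        }
      }
    };
  }
}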
Correlating Calls

Sometimes a business process can receive more than one external call. When that happens, a new workflow instance is created at the first call, its activities are executed and the workflow stays idle, waiting for subsequent calls. When a later call is made, the workflow instance leaves the idle state and continues to be executed. Because there can be many idle instances, the workflow runtime must have a way to use information received on later calls to distinguish between the previously created workflow instances and continue processing the right one. This is called correlation: you correlate subsequent calls to the pending workflow instance with which the call is associated.

A possible scenario in which to use correlation is an expense report workflow (see Figure 3). First, an employee submits the expense report data and a new workflow instance is created. The workflow instance then becomes idle, waiting for the manager to review the report and approve or deny the expenses. When the approval call is made by the manager client application, the workflow runtime correlates it to the previously created workflow instance so the process can continue.

Figure 3 Expense Report Sample Scenario (an employee submits an expense report; a manager approves or denies it)

A correlation is represented as an XPath query that identifies particular data in a specific message. In this scenario, the workflow correlates on the received expense report ID (which is probably a unique ID already). To create a correlation you need some context-identifying information, stored in a correlation handle. The correlation handle is a variable the workflow runtime uses to store the correlation data, and it's automatically created by Visual Studio. It can be initialized using an InitializeCorrelation activity or by adding a value to the CorrelationInitializers property of some activities, such as Receive, SendReply, Send and ReceiveReply. This initialization can be done in code or using the workflow designer in Visual Studio 2010, which is the easier, and probably the preferable, way for most developers.

To create a correlation in Visual Studio 2010, first select in the workflow designer the activity where the correlation is going to be initialized. In this scenario the correlation is created when the workflow returns the response to the employee client application, so I select the SendReply activity that returns the expense report ID to the client. I set the CorrelationInitializers property in the Properties window by clicking the ellipsis button. This displays the Add Correlation Initializers dialog box (see Figure 4), where you can configure the correlation. Three items must be set: the correlation handle, the correlation type and the XPath Queries. Visual Studio has a wizard to help create the XPath query: when I click the arrow, Visual Studio checks the message content and shows a list from which to select the appropriate information, so the XPath query can be set to the expense report ID.

Figure 4 Setting the XPath Query Correlation

The next step is to set the correlation type. The .NET Framework has several correlation types, but because I need to query part of the information exchanged with the client (in other words, a content-based correlation), my best option is to use the Query correlation initializer.

To continue the workflow after the expense approval is made, the correlation must be used by the corresponding Receive activity, the one that receives the approval. This is done by setting the CorrelatesOn property; just click the ellipsis button near the property in the Properties window to open the CorrelatesOn Definition dialog box (see Figure 5). The CorrelatesWith property needs to be set to the same handle used to initialize the correlation,

Figure 5 CorrelatesOn Definition
and the XPath Queries property must be set to the same key and to the expense report ID received on the expense report approval message.

With the messaging in place, I still need the business logic itself. WF comes with a set of general-purpose activities called the Base Activity Library (BAL), some of which I've used here to send and receive information. Though they're useful, sometimes activities more closely related to your business rules are needed, and because WF is very extensible, you can easily create your own. Based on the scenario I've discussed so far, three activities are needed for submitting and approving expense reports: Create, Approve and Deny. Because all of those activities are pretty similar, I'm only going to show the code of CreateExpenseReportActivity:

public sealed class CreateExpenseReportActivity : CodeActivity<int>
{
  public InArgument<decimal> Amount { get; set; }
  public InArgument<string> Description { get; set; }

  protected override int Execute(CodeActivityContext context)
  {
    Data.ExpenseReportManager expenseReportManager =
      new Data.ExpenseReportManager();
    return expenseReportManager.CreateExpenseReport(
      Amount.Get(context),
      Description.Get(context));
  }
}

Because I only need to execute CLR code and don't need to interact with the WF runtime, the easiest option for creating the activity is to inherit from CodeActivity. Most of the heavy lifting is done in the Execute method. The activity receives the expense amount and description, both declared as InArgument, and accesses a class that uses the Entity Framework to handle database access and save the expense report information; the Entity Framework returns the expense report ID, which the activity returns to the workflow. The complete workflow can be seen in Figure 6.

Figure 6 Complete Expense Report Workflow
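The article shows only the Create activity; the Approve and Deny activities follow the same shape. A hedged sketch of what the approval activity might look like; the ApproveExpenseReport method on the data class is an assumption modeled on the helper used above:

public sealed class ApproveExpenseReportActivity : CodeActivity
{
  [RequiredArgument]
  public InArgument<int> ExpenseReportId { get; set; }

  protected override void Execute(CodeActivityContext context)
  {
    // Hypothetical data-access call; mirrors the Entity Framework
    // helper class used by CreateExpenseReportActivity
    Data.ExpenseReportManager expenseReportManager =
      new Data.ExpenseReportManager();
    expenseReportManager.ApproveExpenseReport(this.ExpenseReportId.Get(context));
  }
}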
Hosting the Workflow Service

After the workflow service is created, you need to decide where it will run. The traditional choice has been to run it in your own hosting environment, in IIS or in Windows Process Activation Services (WAS). Another option, however, is to take advantage of Windows Server AppFabric, an enhancement to the Application Server role in Windows Server 2008 R2 for hosting, managing, securing and scaling services created with WCF or WF. (Windows Server AppFabric is a set of extensions for Windows Server that includes caching and hosting services for services based on WCF and WF. You can also employ it on PCs running Windows Vista or Windows 7 for development and testing.) Though IIS and WAS already support service hosting, Windows Server AppFabric offers a more useful and manageable environment that integrates WCF and WF features such as persistence and tracking with IIS Manager.

Simplified Workflow Persistence

Computers still have a limited set of resources to process all of your business processes, and there's no reason to waste computing resources on idle workflows. Due to the detached nature of long-running processes, you may have no control over the total amount of time from the beginning of the process to its end; if it depends on external entities, such as other systems or end users, it can take minutes, hours, days or even longer, and most of that time the workflow can be idle, simply waiting for a response. Once a workflow instance is idle, it can be unloaded to preserve memory and CPU resources, and eventually it could be moved from one node to another in a server farm. This is especially important for scaling out via a load balancer.

To support this, WF provides a persistence framework capable of storing a durable capture of a workflow instance's state, independent of process or computer information, in instance stores. I could create my own instance store if I wanted to, but WF 4 already has a SQL Server instance store to be used out of the box. Windows Server AppFabric has an easy way to set up and maintain integration with the WF persistence features, extending the default WF persistence framework. The whole process is transparent to the workflow runtime, which delegates the persistence tasks to AppFabric.

The first step in configuring persistence is to set up the SQL Server database using the Windows Server AppFabric Configuration Wizard or Windows PowerShell cmdlets. The wizard can create the persistence database if it doesn't exist, or just create the AppFabric schema. With the database created, all the other steps are accomplished with IIS Manager: right-click the node you want to configure (server, Web site or application) and choose Manage WCF and WF Services | Configure to open the Configure WCF and WF for Application dialog, then click Workflow Persistence (see Figure 7). You have the option to enable or disable workflow persistence, and you can also set how long the workflow runtime will take to unload an idle workflow instance from memory and persist it in the database. The default value is 60 seconds; if you set the value to zero, the instance will be persisted immediately.

Figure 7 Configuring Workflow Persistence
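Outside of IIS and the AppFabric configuration UI, for example in a console host used during development, the same SQL instance store and idle settings can be wired up directly on WorkflowServiceHost. A minimal sketch; the workflow type, address and connection string are placeholders:

using System;
using System.ServiceModel.Activities;
using System.ServiceModel.Activities.Description;

class SelfHost
{
  static void Main()
  {
    WorkflowServiceHost host = new WorkflowServiceHost(
      new ExpenseReportWorkflow(),                        // placeholder workflow definition
      new Uri("http://localhost:8080/ExpenseService"));

    // Point at the same persistence database the AppFabric/WF wizard creates
    SqlWorkflowInstanceStoreBehavior store = new SqlWorkflowInstanceStoreBehavior(
      "Data Source=.;Initial Catalog=WFPersistence;Integrated Security=True");
    host.Description.Behaviors.Add(store);

    // Unload (and therefore persist) idle instances after 60 seconds
    WorkflowIdleBehavior idle = new WorkflowIdleBehavior { TimeToUnload = TimeSpan.FromSeconds(60) };
    host.Description.Behaviors.Add(idle);

    host.AddDefaultEndpoints();  // only needed if no endpoints are configured in app.config
    host.Open();
    Console.WriteLine("Service running. Press Enter to exit.");
    Console.ReadLine();
    host.Close();
  }
}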
Workflow Tracking

Sometimes something can go wrong with processes that interact with external users and applications. When a problem occurs, as a developer you usually need to analyze a bunch of logs to discover what happened, how to reproduce it and, most important, how to correct the problem and keep the system up. With long-running, persisted processes it can be even worse. If you use WF, you already get this kind of logging built into the framework. This framework is called tracking, and it records key events during a workflow's execution. Just as WF has an extensible framework to persist idle instances, it also has an extensible framework to provide visibility into workflow execution, and Windows Server AppFabric uses this extensibility to improve the built-in tracking functionality, transparently instrumenting a workflow and recording execution events in a SQL Server database.

The Windows Server AppFabric tracking configuration is similar to that used for persistence and can be accessed via either the Windows Server AppFabric Configuration Wizard or Windows PowerShell cmdlets. In the Configure WCF and WF for Application dialog discussed earlier, click Monitoring. You can choose to enable or disable tracking and also set the tracking level, as shown in Figure 8. You can choose among five monitoring levels:

• Off has the same effect as disabling monitoring and is best used in scenarios that need minimal tracking overhead.
• Errors Only gives visibility to only critical events like errors and warnings. This mode is best for high-performance scenarios that need only minimal error logging.
• Health Monitoring is the default monitoring level and contains all the data captured at the Errors Only level, plus some additional processing data.
• End-to-End Monitoring contains all data from the Health Monitoring level, plus additional information to reconstruct the entire message flow.
• Troubleshooting, as the name suggests, is the most verbose level and is useful in scenarios where an application is in an unhealthy state and needs to be fixed.

Figure 8 Enabling Tracking on Windows Server AppFabric

Scaling the Workflow Service

Because Windows Server AppFabric extends the Application Server role of Windows Server, it inherits a highly scalable infrastructure and can be run on a server farm behind a network load balancer (NLB). An example of a scalable workflow service environment can be seen in Figure 9: two Windows Server AppFabric instances, both running copies of the same workflow definition, a shared persistence database, and an NLB that routes requests to an available AppFabric instance.

Figure 9 Workflows in a Scalable Environment

In the expense report scenario, when a client first accesses the service to create an expense report, the NLB routes the request to an available Windows Server AppFabric instance. The workflow instance saves the expense data in the database, returns the generated ID to the client and, because the workflow becomes idle waiting for the expense approval, the runtime persists the running instance in the database. Later, when the client application accesses the service to approve or deny the expense report, the NLB redirects the request to an available Windows Server AppFabric instance (it can be a different server from the first service call). The server correlates the request, restores the workflow instance from the persistence database, saves the approval in the database and returns to the client when it's done.

Closing Notes

As you've seen, the use of workflow services with correlation, persistence and tracking in a load-balanced environment is a powerful technique for running those services in a scalable manner, spreading workflows across threads, processes and even machines. This allows developers to create a fully scalable solution that's ready to run on a single machine, or even on large server farms, with no worries about infrastructure complexity. The combination of these features can increase productivity and allow proactive action on running services.

For details about Windows Server AppFabric, see the Windows Server Developer Center at msdn.microsoft.com/windowsserver/ee695849. For further information about designing workflows with WCF and WF, be sure to read Leon Welicki's article, "Visual Design of Workflows with WCF and WF 4," from the May 2010 issue of MSDN Magazine (msdn.microsoft.com/magazine/ff646977). And for a deeper discussion of long-running processes and workflow persistence, see Michael Kennedy's article, "Web Apps That Support Long-Running Operations," from the January 2009 issue (msdn.microsoft.com/magazine/dd296718).

RAFAEL GODINHO is an ISV developer evangelist at Microsoft Brazil helping local partners adopt Microsoft technology. You can contact Godinho through his blog at blogs.msdn.com/rafaelgodinho.

THANKS to the following technical experts for reviewing this article: Dave Cliffe, Ron Jacobs and Leon Welicki
WINDOWS WORKFLOW FOUNDATION 4

Authoring Control Flow Activities in WF 4

Leon Welicki

Control flow refers to the way in which the individual instructions in a program are organized and executed. In Windows Workflow Foundation 4 (WF 4), control flow activities govern the execution semantics of one or more child activities. Some examples in the WF 4 activity toolbox include Sequence, Parallel, Flowchart, Pick, ForEach, If and Switch.

You'll learn how to write your own control flow activities following a "crawl, walk, run" approach: we'll start with a very simple control flow activity and add richness as we progress, ending up with a new and useful control flow activity. The source code for all of the examples is available for download.

This article discusses:
• The activities type hierarchy in Windows Workflow Foundation
• Composite and leaf activities
• Bookmarks and execution properties
• GoTo activities

Technologies discussed: Windows Workflow Foundation 4

Control flow options in WF 4 are not limited to the activities shipped in the framework.
You can write your own and use them in combination with the ones provided in the box, and that's what this article will discuss. But first, let's start with some basic concepts about activities to set some common ground.

Activities

Activities are the basic unit of execution in a WF program; a workflow program is a tree of activities that are executed by the WF runtime. WF 4 includes more than 35 activities, a comprehensive set that can be used to model processes or create new activities. Some of those activities govern the semantics of how other activities are executed, such as Sequence, Parallel and ForEach; these are known as composite activities. Others perform a single atomic task, such as WriteLine or InvokeMethod; we call these leaf activities.

WF control flow is based on hierarchy. The WF runtime doesn't have first-class knowledge of any control flow like Sequence or Parallel; from its perspective, everything is just activities. The runtime only enforces some simple rules (for example, "an activity can't complete if any child activities are still running").

WF activities are implemented as CLR types, and as such they derive from other existing types. The base types available for creating your own custom activity are defined in the activities type hierarchy in Figure 1; you'll find a detailed explanation of this hierarchy in the MSDN Library at msdn.microsoft.com/library/dd560893. In this article I'll focus on activities that derive from NativeActivity, the base class that provides access to the full breadth of the WF runtime.
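To make the leaf/composite distinction concrete before diving in: a leaf activity that just computes something can derive from CodeActivity (or CodeActivity<TResult>) and never touches the runtime services at all. This little example is only an illustration, not part of the article's sample code:

using System;
using System.Activities;

// A leaf activity: no children, no bookmarks, just work
public sealed class Greet : CodeActivity<string>
{
  public InArgument<string> Name { get; set; }

  protected override string Execute(CodeActivityContext context)
  {
    return "Hello, " + this.Name.Get(context);
  }
}

// Usage:
// string s = WorkflowInvoker.Invoke(new Greet { Name = "WF" });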
Figure 1 Activity Type Hierarchy (Activity; CodeActivity; AsyncCodeActivity; NativeActivity; Activity<TResult>; CodeActivity<TResult>; AsyncCodeActivity<TResult>; NativeActivity<TResult>)

The activity data model defines a crisp model for reasoning about data when authoring and consuming activities. Data is defined using arguments and variables. Arguments are the binding terminals of an activity and define its public signature in terms of what data can be passed to the activity (input arguments) and what data will be returned by the activity when it completes its execution (output arguments). Activity authors use arguments to define the way data flows in and out of an activity; workflow authors use arguments to bind activities to the environment by writing expressions.

Variables represent temporary storage of data. Activity authors use variables in a couple of ways:
• To expose a user-editable collection of variables on an activity definition that can be used to share data among multiple activities (such as the Variables collection in Sequence and Flowchart).
• To model the internal state of an activity.

Workflow authors declare variables at different scopes in the workflow to share data between activities. Together, variables and arguments combine to provide a predictable model of communication between activities.
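A small sketch of that model in action: a workflow declares a variable, one activity writes it through an OutArgument binding and another reads it through an InArgument expression. This snippet is an illustration of the data model rather than anything from the article's sample:

using System;
using System.Activities;
using System.Activities.Statements;

class Program
{
  static void Main()
  {
    Variable<string> name = new Variable<string>("name", "World");

    Activity wf = new Sequence
    {
      Variables = { name },                      // shared by the children below
      Activities =
      {
        new Assign<string>                       // writes the variable (OutArgument)
        {
          To = name,
          Value = "Workflow"
        },
        new WriteLine                            // reads it (InArgument expression)
        {
          Text = new InArgument<string>(ctx => "Hello, " + name.Get(ctx))
        }
      }
    };

    WorkflowInvoker.Invoke(wf);                  // prints "Hello, Workflow"
  }
}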
A Simple Control Flow Activity

I'll begin by creating a very simple control flow activity called ExecuteIfTrue. There's not much to this activity: it executes a contained activity if a condition is true. WF 4 provides an If activity that includes Then and Else child activities, but often we only want to provide the Then, and the Else is just overhead. For those cases we want an activity that executes another activity based on the value of a Boolean condition. Here's how this activity should work:

• The activity user must provide a Boolean condition. This argument is required.
• The activity user can provide a body: the activity to be executed if the condition is true.
• At execution time: if the condition is true and the body is not null, execute the body.

Here's an implementation for an ExecuteIfTrue activity that behaves in exactly this way:

public class ExecuteIfTrue : NativeActivity
{
  [RequiredArgument]
  public InArgument<bool> Condition { get; set; }
  public Activity Body { get; set; }

  public ExecuteIfTrue() { }

  protected override void Execute(NativeActivityContext context)
  {
    if (context.GetValue(this.Condition) && this.Body != null)
      context.ScheduleActivity(this.Body);
  }
}

This code is very simple, but there's more here than meets the eye. ExecuteIfTrue derives from NativeActivity because it needs to interact with the WF runtime, in this case to schedule another activity. Its public signature consists of a Boolean input argument of type InArgument<bool> named Condition that holds the condition to be evaluated, and a property of type Activity named Body with the activity to be executed if the condition is true. The Condition argument is decorated with the RequiredArgument attribute, which tells the WF runtime that it must be set with an expression; the runtime enforces this validation when preparing the activity for execution. Note as well that the type has been designed following the create-set-use pattern, where the type has a public default constructor and public read/write properties. The XAML syntax is based on this pattern, so the type will be XAML serialization-friendly.

The most interesting piece of code is the Execute method, which is where the "action" happens; all NativeActivities must override this method. Execute receives a NativeActivityContext argument, which is our point of interaction with the WF runtime as activity authors. In ExecuteIfTrue, this context is used to retrieve the value of the Condition argument (context.GetValue(this.Condition)) and to schedule the Body using the ScheduleActivity method. Notice that I say schedule and not execute: the WF runtime does not execute activities right away, it adds them to a list of work items to be scheduled for execution.

The following code snippet shows how to use this activity. In this example, if the current day is Saturday, you write the string "Rest!" to the console:

var act = new ExecuteIfTrue
{
  Condition = new InArgument<bool>(c => DateTime.Now.DayOfWeek == DayOfWeek.Saturday),
  Body = new WriteLine { Text = "Rest!" }
};

WorkflowInvoker.Invoke(act);

The first control flow activity has been created in 15 lines of code. But don't be fooled by the simplicity of that code; it's actually a fully functional control flow activity!
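For comparison, the same "rest on Saturday" logic written with the built-in If activity looks like this; ExecuteIfTrue simply removes the unused Else slot (illustration only):

var ifAct = new If
{
  Condition = new InArgument<bool>(c => DateTime.Now.DayOfWeek == DayOfWeek.Saturday),
  Then = new WriteLine { Text = "Rest!" }
  // Else is simply left unset
};

WorkflowInvoker.Invoke(ifAct);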
Scheduling Multiple Children

The next challenge is to write a simplified version of the Sequence activity. This activity is almost functionally equivalent to the Sequence shipped in the product; the goal of the exercise is to learn how to write a control flow activity that schedules multiple child activities and executes in multiple episodes. Here's how SimpleSequence should work:

• The activity user must provide a collection of children to be executed sequentially through the Activities property.
• At execution time:
  • The activity keeps an internal variable with the index of the last item in the collection that has been executed.
  • If there are items in the children collection, schedule the first child.
  • When the child completes, increment the index of the last item executed; if the index is still within the boundaries of the children collection, schedule the next child, and repeat.

The code in Figure 2 implements a SimpleSequence activity that behaves exactly as described. The code is simple, about 50 lines, but it introduces some interesting concepts. Again, it derives from NativeActivity because it needs to interact with the runtime to schedule children.

Figure 2 The SimpleSequence Activity

public class SimpleSequence : NativeActivity
{
  // Child activities and variables collections
  Collection<Activity> activities;
  Collection<Variable> variables;

  // Pointer to the current item in the collection being executed
  Variable<int> current = new Variable<int>() { Default = 0 };

  public SimpleSequence() { }

  // Collection of children to be executed sequentially by SimpleSequence
  public Collection<Activity> Activities
  {
    get
    {
      if (this.activities == null)
        this.activities = new Collection<Activity>();
      return this.activities;
    }
  }

  public Collection<Variable> Variables
  {
    get
    {
      if (this.variables == null)
        this.variables = new Collection<Variable>();
      return this.variables;
    }
  }

  protected override void CacheMetadata(NativeActivityMetadata metadata)
  {
    metadata.SetChildrenCollection(this.activities);
    metadata.SetVariablesCollection(this.variables);
    metadata.AddImplementationVariable(this.current);
  }

  protected override void Execute(NativeActivityContext context)
  {
    // Schedule the first activity
    if (this.Activities.Count > 0)
      context.ScheduleActivity(this.Activities[0], this.OnChildCompleted);
  }

  void OnChildCompleted(NativeActivityContext context, ActivityInstance completed)
  {
    // Calculate the index of the next activity to be scheduled
    int currentExecutingActivity = this.current.Get(context);
    int next = currentExecutingActivity + 1;

    // If index within boundaries, schedule the next activity
    if (next < this.Activities.Count)
    {
      context.ScheduleActivity(this.Activities[next], this.OnChildCompleted);

      // Store the index in the collection of the activity executing
      this.current.Set(context, next);
    }
  }
}

The next step is to define the public signature for SimpleSequence. In this case it consists of a collection of activities (of type Collection<Activity>) exposed through the Activities property, and a collection of variables (of type Collection<Variable>) exposed through the Variables property. Variables allow sharing of data among all child activities. Note that these properties only have "getters" that expose the collections using a "lazy instantiation" approach (see Figure 3), so accessing these properties never results in a null reference; this keeps the properties compliant with the create-set-use pattern.

Figure 3 A Lazy Instantiation Approach

public Collection<Activity> Activities
{
  get
  {
    if (this.activities == null)
      this.activities = new Collection<Activity>();
    return this.activities;
  }
}

public Collection<Variable> Variables
{
  get
  {
    if (this.variables == null)
      this.variables = new Collection<Variable>();
    return this.variables;
  }
}

There's one private member in the class that's not part of the signature: a Variable<int> named "current" that holds the index of the activity being executed:

// Pointer to the current item in the collection being executed
Variable<int> current = new Variable<int>() { Default = 0 };

Because this information is part of the internal execution state for SimpleSequence, you want to keep it private and not expose it to users of SimpleSequence. You also want it to be saved and restored when the activity is persisted. You use an ImplementationVariable for this. Implementation variables are variables that are internal to an activity. They're intended to be consumed by the activity author, not the activity user, and they are persisted when the activity is persisted and restored when the activity is reloaded, without requiring any work on our part. To make this clear, and continuing with the Sequence example: if an instance of SimpleSequence is persisted, when it wakes up it will "remember" the index of the last activity that has been executed.
The WF runtime can't automatically know about implementation variables, so you need to explicitly inform it that you want to use one. You do this during execution of the CacheMetadata method. Despite its rather frightening name, CacheMetadata is not that difficult: it's the method where an activity "introduces itself" to the runtime. Think about the If activity for a moment. In CacheMetadata it would say: "Hi, I'm the If activity and I have an input argument named Condition and two children, Then and Else." SimpleSequence, in turn, is saying: "Hi, I'm SimpleSequence and I have a collection of child activities, a collection of variables and an implementation variable." There's no more than that in the SimpleSequence code for CacheMetadata:

protected override void CacheMetadata(NativeActivityMetadata metadata)
{
  metadata.SetChildrenCollection(this.activities);
  metadata.SetVariablesCollection(this.variables);
  metadata.AddImplementationVariable(this.current);
}

The default implementation of CacheMetadata uses reflection to get this data from the activity. In the ExecuteIfTrue example, I didn't implement CacheMetadata and relied on the default implementation to reflect on public members. For SimpleSequence I did need to implement it, because the default implementation can't "guess" about my desire to use implementation variables.

The next interesting piece of code in this activity is the Execute method. In this case, if there are activities in the collection, you tell the WF runtime: "Please execute the first activity in the collection of activities and, when you're done, invoke the OnChildCompleted method." You say this in WF terms using NativeActivityContext.ScheduleActivity. Notice that when you schedule an activity, you supply a second argument that's a CompletionCallback; in simple terms, a method that will be called once the scheduled activity's execution has completed. The CompletionCallback is not called when the activity is scheduled, so once again it's important to be aware of the difference between scheduling and execution.

protected override void Execute(NativeActivityContext context)
{
  // Schedule the first activity
  if (this.Activities.Count > 0)
    context.ScheduleActivity(this.Activities[0], this.OnChildCompleted);
}

The OnChildCompleted method is the most interesting part of this activity from a learning perspective, and it's actually the main reason I've included SimpleSequence in this article. It gets the next activity in the collection and schedules it, again providing a CompletionCallback that points to this same method. Thus, when a child completes, this method executes again to look for the next child and schedule it.

Conceptually, execution is happening in pulses or episodes; each pulse here executes one child of the SimpleSequence. Because workflows can be persisted and unloaded from memory, there can be a large time difference between two pulses of execution. Moreover, these pulses can be executed in different threads, processes or even machines (as persisted instances of a workflow can be reloaded in a different process or machine). Learning how to program for multiple pulses of execution is one of the biggest challenges in becoming an expert control flow activity author.

The following code snippet shows how to use this activity. In this example, I'm writing three strings to the console ("Hello", "Workflow" and "!"):

var act = new SimpleSequence()
{
  Activities =
  {
    new WriteLine { Text = "Hello" },
    new WriteLine { Text = "Workflow" },
    new WriteLine { Text = "!" }
  }
};

WorkflowInvoker.Invoke(act);

I've authored my own SimpleSequence! Now it's time to move on to the next challenge.
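Before moving on, here's a host-level view of those execution pulses. Running a workflow with WorkflowApplication, handing it an instance store and letting it unload whenever it goes idle means the next stimulus reloads it, possibly in another process. A sketch only; the connection string is a placeholder, and the Delay is just something that makes the instance go idle between pulses:

using System;
using System.Activities;
using System.Activities.DurableInstancing;
using System.Activities.Statements;

class Host
{
  static void Main()
  {
    Activity definition = new Sequence
    {
      Activities =
      {
        new WriteLine { Text = "First pulse" },
        new Delay { Duration = TimeSpan.FromMinutes(5) },  // instance goes idle here
        new WriteLine { Text = "Second pulse" }
      }
    };

    WorkflowApplication app = new WorkflowApplication(definition);
    app.InstanceStore = new SqlWorkflowInstanceStore(
      "Data Source=.;Initial Catalog=WFPersistence;Integrated Security=True");

    // Persist and unload as soon as the instance becomes idle
    app.PersistableIdle = e => PersistableIdleAction.Unload;
    app.Unloaded = e => Console.WriteLine("Unloaded; a later host can Load() it by Id");

    Guid id = app.Id;   // another WorkflowApplication could Load(id) later, elsewhere
    app.Run();
    Console.ReadLine();
  }
}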
Implementing a New Control Flow Pattern

Next, I'll create a complex control flow activity. This section demonstrates how to build your own control flow activity to support a control flow pattern that's not supported out of the box by WF 4. The new control flow activity will be called Series. Its goal is simple: to provide a Sequence with support for GoTos, where the next activity to be executed can be manipulated either explicitly from inside the workflow (through a GoTo activity) or from the host (by resuming a well-known bookmark).

To recap, here are the goals and requirements for the custom control flow activity:

1. It's a Sequence of activities.
2. It can contain GoTo activities (at any depth), which change the point of execution to any direct child of the Series.
3. It can receive GoTo messages from the outside (for example, from a user), which also change the point of execution to any direct child of the Series.

To implement this new control flow, I'll need to author two activities: Series, a composite activity that contains a collection of activities and executes them sequentially (but allows jumps to any item in the sequence), and GoTo, a leaf activity used inside of Series to explicitly model the jumps.

I'll start by implementing the Series activity. Here's the execution semantics in simple terms:

• The activity user must provide a collection of children to be executed sequentially through the Activities property.
• The activity contains an internal variable with the activity instance being executed.
• In the Execute method: create a bookmark for the GoTo in a way that's available to child activities; if there are items in the children collection, schedule the first child and store the activity that's currently being executed in the "current" variable.
• When a child completes: look up the completed activity in the Activities collection; if the index of the next activity is still within the boundaries of the collection, schedule the next child, and repeat.
• If the GoTo bookmark is resumed: get the name of the activity we want to go to; find that activity in the collection; cancel the activity that's currently executing; schedule the target activity for execution and register a completion callback that will schedule the next activity.
Before going further, let me introduce bookmarks and execution properties.

Bookmarks are the mechanism by which an activity can passively wait to be resumed. When an activity wishes to "block" pending a certain event, it registers a bookmark and then returns an execution status of continuing. This signals the runtime that although the activity's execution is not complete, it doesn't have any more work to do as part of the current work item. Bookmarks let you author your activities using a form of reactive execution: when the bookmark is created the activity yields, and when the bookmark is resumed a block of code (the bookmark resumption callback) is invoked in reaction to the resumption.

Unlike programs directly targeting the CLR, workflow programs are hierarchically scoped trees that execute in a thread-agnostic environment. This implies that the standard thread local storage (TLS) mechanisms can't be directly leveraged to determine what context is in scope for a given work item. The workflow execution context introduces execution properties to an activity's environment, so that an activity can declare properties that are in scope for its sub-tree and share them with its children.
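To make the bookmark mechanics concrete before applying them in Series, here's the smallest possible example: an activity that waits for a single piece of input and a host that resumes it. This is an illustration of the yield/resume shape only, not part of the Series sample:

public sealed class WaitForText : NativeActivity<string>
{
  public InArgument<string> BookmarkName { get; set; }

  // Creating a bookmark makes the workflow go idle, so this must return true
  protected override bool CanInduceIdle
  {
    get { return true; }
  }

  protected override void Execute(NativeActivityContext context)
  {
    context.CreateBookmark(this.BookmarkName.Get(context), this.OnResumed);
  }

  void OnResumed(NativeActivityContext context, Bookmark bookmark, object value)
  {
    this.Result.Set(context, (string)value);   // the data passed by the host
  }
}

// Host side (sketch):
// WorkflowApplication app = new WorkflowApplication(new WaitForText { BookmarkName = "input" });
// app.Run();
// ...
// app.ResumeBookmark("input", "hello from the host");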
With those concepts in place, let's get back to the code. The example in Figure 4 shows the implementation for a Series activity that behaves exactly as described. Series derives from NativeActivity because it needs to interact with the WF runtime to schedule child activities, create bookmarks, cancel children and use execution properties.

As before, the next step is to define the public signature for Series. As in SimpleSequence, there are Activities and Variables collection properties, so an activity can share data with its descendants through them, and I'm again following the create-set-use pattern. There's also a string input argument named BookmarkName (of type InArgument<string>) with the name of the bookmark to be created for host resumption.

Series has a private member named "current" that contains the ActivityInstance being executed, instead of just a pointer to an item in a collection. Why is current a Variable<ActivityInstance> and not a Variable<int>? Because I need to get hold of the currently executing child later in this activity, during the GoTo method. I'll explain the details shortly; the important thing to understand now is that there's an implementation variable that holds the activity instance being executed:

// Activity instance that is currently being executed
Variable<ActivityInstance> current = new Variable<ActivityInstance>();

In CacheMetadata you provide the runtime information about your activity: the children and variables collections, the implementation variable with the current activity instance, and the bookmark name argument. The only difference from the previous example is that I'm manually registering the BookmarkName input argument with the WF runtime, adding a new instance of RuntimeArgument to the activity metadata:

protected override void CacheMetadata(NativeActivityMetadata metadata)
{
  metadata.SetChildrenCollection(this.activities);
  metadata.SetVariablesCollection(this.variables);
  metadata.AddImplementationVariable(this.current);
  metadata.AddArgument(new RuntimeArgument("BookmarkName", typeof(string), ArgumentDirection.In));
}

The next new thing is the CanInduceIdle property overload. This is just more metadata that the activity provides to the WF runtime. When this property returns true, I'm telling the runtime that this activity can cause the workflow to become idle. The default value is false, so I need to override it and return true for activities that create bookmarks, because they'll make the workflow go idle waiting for resumption. If this property returns false and we create a bookmark, we get an InvalidOperationException when executing the activity.

Figure 4 The Series Activity

public class Series : NativeActivity
{
  internal static readonly string GotoPropertyName =
    "Microsoft.Samples.CustomControlFlow.Series.Goto";

  // Child activities and variables collections
  Collection<Activity> activities;
  Collection<Variable> variables;

  // Activity instance that is currently being executed
  Variable<ActivityInstance> current = new Variable<ActivityInstance>();

  // For externally initiated goto's; optional
  public InArgument<string> BookmarkName { get; set; }

  public Series() { }

  public Collection<Activity> Activities
  {
    get
    {
      if (this.activities == null)
        this.activities = new Collection<Activity>();
      return this.activities;
    }
  }

  public Collection<Variable> Variables
  {
    get
    {
      if (this.variables == null)
        this.variables = new Collection<Variable>();
      return this.variables;
    }
  }

  protected override void CacheMetadata(NativeActivityMetadata metadata)
  {
    metadata.SetChildrenCollection(this.activities);
    metadata.SetVariablesCollection(this.variables);
    metadata.AddImplementationVariable(this.current);
    metadata.AddArgument(new RuntimeArgument("BookmarkName", typeof(string), ArgumentDirection.In));
  }

  protected override void Execute(NativeActivityContext context)
  {
    // If there are activities in the collection, schedule the first one
    if (this.Activities.Count > 0)
    {
      // Create a bookmark for signaling the GoTo
      Bookmark internalBookmark = context.CreateBookmark(this.Goto,
        BookmarkOptions.MultipleResume | BookmarkOptions.NonBlocking);

      // Save the bookmark as an execution property
      context.Properties.Add(GotoPropertyName, internalBookmark);

      // Create a bookmark for external (host) resumption
      if (this.BookmarkName.Get(context) != null)
        context.CreateBookmark(this.BookmarkName.Get(context), this.Goto,
          BookmarkOptions.MultipleResume | BookmarkOptions.NonBlocking);

      // Schedule the first item in the list and save the resulting
      // ActivityInstance in the "current" implementation variable
      this.current.Set(context,
        context.ScheduleActivity(this.Activities[0], this.OnChildCompleted));
    }
  }

  void OnChildCompleted(NativeActivityContext context, ActivityInstance completed)
  {
    // This callback also executes when cancelled child activities complete
    if (completed.State == ActivityInstanceState.Closed)
    {
      // Find the next activity and execute it
      int completedActivityIndex = this.Activities.IndexOf(completed.Activity);
      int next = completedActivityIndex + 1;

      if (next < this.Activities.Count)
        this.current.Set(context,
          context.ScheduleActivity(this.Activities[next], this.OnChildCompleted));
    }
  }

  void Goto(NativeActivityContext context, Bookmark b, object obj)
  {
    // Get the name of the activity to go to
    string targetActivityName = obj as string;

    // Find the activity to go to in the children list
    Activity targetActivity = this.Activities
      .Where<Activity>(a => a.DisplayName.Equals(targetActivityName))
      .Single();

    // Schedule the activity
    ActivityInstance instance = context.ScheduleActivity(targetActivity, this.OnChildCompleted);

    // Cancel the activity that is currently executing
    context.CancelChild(this.current.Get(context));

    // Set the activity that is executing now as the current
    this.current.Set(context, instance);
  }

  protected override bool CanInduceIdle
  {
    get { return true; }
  }
}
Things get more interesting in the Execute method. What I'm doing at the beginning is creating a bookmark (using context.CreateBookmark) and saving it in an execution property (using context.Properties.Add). The reason to save it into an execution property is to make it available to all child activities; you'll see later how the GoTo activity uses this bookmark. Notice that execution properties have names. Because that name is a string, I defined a constant (GotoPropertyName) with the name for the property in the activity; the name follows a fully qualified name approach, which is a best practice:

internal static readonly string GotoPropertyName =
  "Microsoft.Samples.CustomControlFlow.Series.Goto";

// Create a bookmark for signaling the GoTo
Bookmark internalBookmark = context.CreateBookmark(this.Goto,
  BookmarkOptions.MultipleResume | BookmarkOptions.NonBlocking);

// Save the bookmark as an execution property
context.Properties.Add(GotoPropertyName, internalBookmark);

This bookmark is a multiple-resume bookmark, meaning it can be resumed multiple times and it will be available while its parent activity is in an executing state. It's also NonBlocking, so it won't prevent the activity from completing once it's done with its job. When the bookmark is resumed, the Goto method will be called, because I passed it as the bookmark resumption callback (the first parameter to CreateBookmark).

Once I've declared the bookmark, I'm ready to schedule my first activity. I'm already familiar with this, because I did it in my previous activities: I schedule the first activity in the collection and tell the runtime to invoke the OnChildCompleted method when the activity is done, as I did in SimpleSequence. Context.ScheduleActivity returns an ActivityInstance that represents an instance of an activity being executed, and I assign it to the "current" implementation variable:

// Schedule the first item in the list and save the resulting
// ActivityInstance in the "current" implementation variable
this.current.Set(context,
  context.ScheduleActivity(this.Activities[0], this.OnChildCompleted));

Let me clarify this a bit. Activity is the definition, like a class; ActivityInstance is the actual instance, like an object. We can have multiple ActivityInstances from the same Activity.
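For the second, host-facing bookmark declared in the signature, the stimulus comes from outside the workflow. A sketch of a host that jumps the Series to a named step; the bookmark name "jump", the seriesWorkflow variable and the step name are whatever the workflow author configured, so all three are assumptions here:

// Host-side sketch: the Series below was built with BookmarkName = "jump"
WorkflowApplication app = new WorkflowApplication(seriesWorkflow);
app.Run();

// ... later, in response to some external event ...
BookmarkResumptionResult result = app.ResumeBookmark("jump", "First Step");
if (result != BookmarkResumptionResult.Success)
  Console.WriteLine("Bookmark not ready: {0}", result);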
it can resume it with a jump to any activity within the Series: // Create a bookmark for external (host) resumption if (this. I schedule it.Where<Activity>(a => a. This argument is required. we create a bookmark that can be used by the host to jump to any activity within the series. Its public signature consists of the TargetActivityName string input argument that contains the name of the activity we want to jump to. First.Activities . // Resume the bookmark passing the target activity name context. reached the closed state and hasn’t been canceled or faulted). indicating the name of the activity we want to go to.Set(context.DisplayName. instance). activities. indicating that when the activity is done. • At execution time: • The GoTo activity will locate the “GoTo” bookmark created by the Series activity. Series will cancel this activity // in its GoTo method context. like a class. Activity is the definition. I look up the requested activity definition in the “activities” collection. object data) { // Get the name of the activity to go to string targetActivityName = data as string. } protected override bool CanInduceIdle { get { return true. BookmarkOptions.Single(). I’ll schedule the first activity in the collection and tell the runtime to invoke the OnChildCompleted method when the activity is done (as I did in SimpleSequence).ResumeBookmark(bookmark.. context.GotoPropertyName) as Bookmark.Get(context).. I decorated that argument with Windows Workflow Foundation 4 The OnChildCompleted method should be straightforward now. • It will be cancelled by the Series. Here’s a simple description of the execution semantics: • The activity user must provide a string TargetActivityName. We can have multiple ActivityInstances from the same Activity: // Schedule the first item in the list and save the resulting // ActivityInstance in the "current" implementation variable this. It’s similar to a GoTo statement in an imperative program. Get(context)).current. I pass it as a parameter of the CancelChild method of NativeActivityContext. OnChildCompleted). For both these tasks I use the “current” variable. // Set the activity that is executing now as the current this. set. it will resume it. like an object.CreateBookmark("SyncBookmark"). } } The target activity name is the DisplayName property of the activity.Figure 5 The GoTo Activity public class GoTo : NativeActivity { public GoTo() { } [RequiredArgument] public InArgument<string> TargetActivityName { get. In this case. which I assign to our current implementation variable. • It will create a synchronization bookmark. which is passed when the bookmark is resumed.Goto.Properties.ScheduleActivity(this. Let me clarify this a bit.current. DESIGN Design Applications That Help Run the Business Our xamMap™ control in Silverlight and WPF lets you map out any geospatial data like this airplane seating app to manage your business. Come to infragistics.com to try it today! Infragistics Sales 800 231 8588 Infragistics Europe Sales +44 (0) 800 298 9055 Infragistics India +91 80 4151 8042 @infragistics . Properties.microsoft. waiting for a stimulus to resume the bookmark. The most important part is in the Execute method.activities.com/netframework/aa663328 Endpoint. meaning that the WF validation services will enforce that it’s set with an expression. In this case. you can easily create your own. You’ll also find a series of Channel 9 videos on activity authoring best practices. // Resume the bookmark passing the target activity name context. 
Figure 6 shows an example using this activity. This is a simple example to illustrate the basic use of Series: I'm looping back to a previous state according to the value of a variable, but this activity can be used to implement complex, real-world business scenarios where you need to skip, redo or jump to steps in a sequential process.

Figure 6 Using GoTo in a Series

  var counter = new Variable<int>();

  var act = new Series
  {
    Variables = { counter },
    Activities =
    {
      new WriteLine { DisplayName = "Start", Text = "Step 1" },
      new WriteLine { DisplayName = "First Step", Text = "Step 2" },
      new Assign<int>
      {
        To = counter,
        Value = new InArgument<int>(c => counter.Get(c) + 1)
      },
      new If
      {
        Condition = new InArgument<bool>(c => counter.Get(c) == 3),
        Then = new WriteLine { Text = "Step 3" },
        Else = new GoTo { TargetActivityName = "First Step" }
      },
      new WriteLine { Text = "The end!" }
    }
  };

  WorkflowInvoker.Invoke(act);
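Figure 6 runs the workflow with WorkflowInvoker, so only the internal GoTo bookmark is exercised. To see the external (host) bookmark in action you'd host the same Series with WorkflowApplication and resume the bookmark by name. The following is only a minimal sketch of the API shape, not code from the article: it assumes the Series instance was also given BookmarkName = "JumpList", and in a real host the resumption would normally be driven from the Idle event or some external stimulus rather than called inline (namespaces: System.Activities, System.Threading).

  var completed = new AutoResetEvent(false);
  var app = new WorkflowApplication(act);
  app.Completed = e => completed.Set();
  app.Run();

  // Some external stimulus arrives: jump straight to the activity
  // whose DisplayName is "First Step".
  app.ResumeBookmark("JumpList", "First Step");

  completed.WaitOne();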
Go with the Flow
In this article I presented the general aspects of writing custom control flow activities. I started with a simple control flow activity and then worked my way up to implement a custom control flow activity that adds new execution semantics to WF 4. In WF 4, the control flow spectrum is not fixed: if the activities provided out of the box don't meet your needs, you can easily create your own, because writing custom activities has been dramatically simplified. By writing your own custom activities, you can express any control flow pattern in WF and accommodate WF to the particulars of your problem.

If you want to learn more, there's a wealth of information and tools out there to help you get started. You'll also find a series of Channel 9 videos on activity authoring best practices, and a Community Technology Preview for State Machine is available at CodePlex with full source code.

References
Windows Workflow Foundation 4 Developer Center: msdn.microsoft.com/netframework/aa663328
Endpoint.tv: Activities Authoring Best Practices: channel9.msdn.com/shows/Endpoint/endpointtv-Workflow-and-CustomActivities-Best-Practices-Part-1/
Designing and Implementing Custom Activities: msdn.microsoft.com/library/dd489425
ActivityInstance Class: msdn.microsoft.com/library/system.activities.activityinstance
RuntimeArgument Class: msdn.microsoft.com/library/dd454495

LEON WELICKI is a program manager in the Windows Workflow Foundation (WF) team at Microsoft working on the WF runtime. Prior to joining Microsoft, he worked as lead architect and dev manager for a large Spanish telecom company and as an external associate professor on the graduate computer science faculty at the Pontifical University of Salamanca at Madrid.

THANKS to the following technical experts for reviewing this article: Joe Clancy, Dan Glick, Rajesh Sampath, Bob Schmidt and Isaac Yuen
SILVERLIGHT EXPOSED
Using MEF to Expose Interfaces in Your Silverlight MVVM Apps
Sandrino Di Mattia

This article discusses: getting started with MEF, extending Silverlight and MEF, custom navigation, and handling the View and ViewModel. Technologies discussed: ASP.NET, Silverlight. Code download available at: code.msdn.microsoft.com/mag201101MEF

While many developers may think of Silverlight as a Web-centric technology, in practice it has become a great platform for building any type of application, and when I say all kinds, I also mean enterprise applications. Silverlight has built-in support for concepts such as data binding, value converters, navigation, out-of-browser operation and COM interop, making it relatively straightforward to create all kinds of applications. WCF RIA Services (silverlight.net/getstarted/riaservices) helps you easily access Windows Communication Foundation (WCF) services and databases thanks to code generation, and the MVVM Light Toolkit (mvvmlight.codeplex.com) is a lightweight framework for implementing MVVM with Silverlight and Windows Presentation Foundation (WPF).

Creating a Silverlight application with the Model-View-ViewModel (MVVM) pattern gives you, in addition to the features already in Silverlight, the advantages of greater maintainability, testability and separation of your UI from the logic behind it. You can take your Silverlight application a step further with the Managed Extensibility Framework (mef.codeplex.com), also known as MEF. This framework provides the plumbing to create extensible applications using components and composition. If you're not familiar with MEF already, start by reading Glenn Block's article, "Building Composable Apps in .NET 4 with the Managed Extensibility Framework," in the February 2010 issue of MSDN Magazine (msdn.microsoft.com/magazine/ee291628).

With MEF you can go much further than just putting a ViewModel in the DataContext of the View. In the rest of the article I'll show you how to use MEF to get centralized management of View and ViewModel creation. When the user navigates to a given URL, MEF intercepts this request, looks at the route (a bit like ASP.NET MVC), finds a matching View and ViewModel, notifies the ViewModel of what's happening and displays the View. All this will be done by customizing the built-in Silverlight navigation. And, of course, you don't have to figure all this out on your own: the code download for this article contains the complete example.

Getting Started with MEF
Because MEF is the engine that will connect all the parts of this example, it's best to start with it. First you need to correctly configure MEF when the application starts by handling the Startup event of the App class:

  private void OnStart(object sender, StartupEventArgs e)
  {
    // Initialize the container using a deployment catalog.
    var catalog = new DeploymentCatalog();
    var container = CompositionHost.Initialize(catalog);

    // Export the container as singleton.
    container.ComposeExportedValue<CompositionContainer>(container);

    // Make sure the MainView is imported.
    CompositionInitializer.SatisfyImports(this);
  }

The deployment catalog makes sure all assemblies are scanned for exports and is then used to create a CompositionContainer. Because the navigation plumbing will require this container to do some work later on, it's important to register the instance of this container as an exported value. Using ComposeExportedValue, the instance of the container is registered in itself; as a result, an Import of CompositionContainer will always give you the same object, so the same container can be imported wherever it's required. Another option would be to store the container as a static object, but this would create tight coupling between the classes, which isn't suggested. Note that you should only use this technique when you don't have another choice, as it actually couples your code to MEF.

Extending Silverlight Navigation
The Silverlight Navigation Application template in Visual Studio enables you to quickly create applications that support navigation using a Frame that hosts the content. The great thing about Frames is that they integrate with the Back and Forward buttons of your browser and they support deep linking. Look at the following:

  <navigation:Frame x:Name="ContentFrame"
                    Style="{StaticResource ContentFrameStyle}"
                    Source="Customers"
                    NavigationFailed="OnNavigationFailed">
    <i:Interaction.Behaviors>
      <fw:CompositionNavigationBehavior />
    </i:Interaction.Behaviors>
  </navigation:Frame>

This is just a regular frame that starts by navigating to Customers. However, this Frame doesn't contain a UriMapper (where you could link Customers to a XAML file, such as /Views/Customers.aspx). The only thing it contains is my custom behavior. A behavior (from the System.Windows.Interactivity assembly) allows you to extend existing controls, such as a Frame in this case. Figure 1 shows the behavior.

Figure 1 CompositionNavigationBehavior

  public class CompositionNavigationBehavior : Behavior<Frame>
  {
    private bool processed;

    [Import]
    public CompositionContainer Container { get; set; }

    [Import]
    public CompositionNavigationContentLoader Loader { get; set; }

    public CompositionNavigationBehavior()
    {
      if (!DesignerProperties.IsInDesignTool)
        CompositionInitializer.SatisfyImports(this);
    }

    protected override void OnAttached()
    {
      base.OnAttached();
      if (!processed)
      {
        this.SetContentLoader();
        this.RegisterNavigationService();
        processed = true;
      }
    }

    private void SetContentLoader()
    {
      var frame = AssociatedObject;
      frame.ContentLoader = Loader;
      frame.JournalOwnership = JournalOwnership.Automatic;
    }

    private void RegisterNavigationService()
    {
      var frame = AssociatedObject;
      var svc = new NavigationService(frame);
      Container.ComposeExportedValue<INavigationService>(svc);
    }
  }

The first thing you can see is that the behavior wants a CompositionContainer and a CompositionNavigationContentLoader (more on this later) because of the Import attributes. The constructor then forces the Import using the SatisfyImports method on CompositionInitializer. Because the container instance was registered in itself at startup, the behavior receives the shared container; this is why I used ComposeExportedValue in the Startup event of the App class.

When the Frame is attached, a NavigationService is created and wrapped around the Frame, and the instance of this wrapper is registered in the container in the same way. It's now possible from anywhere in the application to ask for an INavigationService (that wraps around a Frame). Without having to couple your ViewModels to a frame, you get access to the following:

  public interface INavigationService
  {
    void Navigate(string path);
    void Navigate(string path, params object[] args);
  }
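To make that concrete, here's a hypothetical consumer class of my own (not from the article): because the behavior exported the wrapper with ComposeExportedValue<INavigationService>, any part that's composed against the shared container can import it and navigate without ever touching the Frame. The class name and method are illustrative only.

  public class CustomerSearchViewModel
  {
    [Import]
    public INavigationService NavigationService { get; set; }

    public CustomerSearchViewModel()
    {
      // Pulls the Import from the shared container created at startup.
      CompositionInitializer.SatisfyImports(this);
    }

    public void ShowCustomer(int customerId)
    {
      NavigationService.Navigate("Customer/{0}", customerId);
    }
  }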
The other job of the behavior, SetContentLoader, changes the ContentLoader of the Frame. You can provide your own ContentLoader (one that implements the INavigationContentLoader interface) to really provide something to show in the Frame. This is a perfect example of the support for extensibility in Silverlight. Now that you can see how things start falling into place, the following topic, extending MEF, will become clear.

Back to Extending MEF
The goal here is that you can navigate to a certain path (be it from the ViewModel or your browser address bar) and have the matching View and ViewModel created for you. For example, assume you have a ViewModel showing all of your customers and this ViewModel should be able to open a specific customer. That could be achieved using the following code:

  [Import]
  public INavigationService NavigationService { get; set; }

  private void OnOpenCustomer()
  {
    NavigationService.Navigate("Customer/{0}", SelectedCustomer.Id);
  }

In this case it would be interesting to use the Export attribute with some extra configuration, referred to as metadata. Normally you'd write something like this:

  [Export(typeof(IMainViewModel))]
  public class MainViewModel

Instead, the ViewModels in this example are decorated with a custom export attribute that also records the navigation path. Figure 2 shows an example of a metadata attribute. There are a few important things to note. First, your attribute should be decorated with [MetadataAttribute] to work correctly. Second, your attribute should implement an interface with the values you'll want to expose as metadata. Finally, mind the constructor of the attribute: it passes a type to the base constructor, and the class that's decorated with this attribute will be exposed using this type. Because it's mandatory that each ViewModel implements the IViewModel interface, this would be IViewModel.

Figure 2 Creating the ViewModelExportAttribute

  [MetadataAttribute]
  [AttributeUsage(AttributeTargets.Class, AllowMultiple = false)]
  public class ViewModelExportAttribute : ExportAttribute, IViewModelMetadata
  {
    public Type ViewModelContract { get; set; }
    public string Key { get; set; }
    public string NavigationPath { get; set; }

    public ViewModelExportAttribute(Type viewModelContract, string navigationPath)
      : base(typeof(IViewModel))
    {
      this.ViewModelContract = viewModelContract;
      this.NavigationPath = navigationPath;

      if (NavigationPath != null && NavigationPath.Contains("/"))
      {
        // Split the path to get the arguments.
        var split = NavigationPath.Split(new char[] { '/' },
          StringSplitOptions.RemoveEmptyEntries);

        // Get the key.
        Key = split[0];
      }
      else
      {
        // No arguments, use the whole key.
        Key = NavigationPath;
      }
    }
  }

Here's an example of how this attribute is used:

  [ViewModelExport(typeof(ICustomerDetailViewModel), "Customer/{id}")]
  public class CustomerDetailViewModel : ICustomerDetailViewModel

This allows you to define a navigation path such as Customer/{Id}. The loader will then process this path using Customer as Key and {Id} as one of the arguments. The View needs something similar:

  [MetadataAttribute]
  [AttributeUsage(AttributeTargets.Class, AllowMultiple = false)]
  public class ViewExportAttribute : ExportAttribute, IViewMetadata
  {
    public Type ViewModelContract { get; set; }

    public ViewExportAttribute() : base(typeof(IView)) { }
  }

This attribute allows you to set the contract of the ViewModel to which the View should be linked. Here's an example of AboutView:

  [ViewExport(ViewModelContract = typeof(IAboutViewModel))]
  public partial class AboutView : Page, IView
  {
    public AboutView()
    {
      InitializeComponent();
    }
  }

If you want to import these exports somewhere, including their metadata, you would normally write something like this:

  [ImportMany(typeof(IViewModel))]
  public List<Lazy<IViewModel, IViewModelMetadata>> ViewModels { get; set; }

This will give you a list that contains all exported ViewModels with their respective metadata, allowing you to enumerate the list and pick out only the ones of interest to you based on the metadata. However, instead of using a Lazy object I'm using an ExportFactory, as you'll see in the content loader.
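The article doesn't list IViewModelMetadata itself; given the public properties on ViewModelExportAttribute, it presumably looks something like the interface below, and MEF projects the attribute values onto it without instantiating any ViewModel. The ExportReport class is my own hypothetical diagnostic to show how the metadata can be inspected on its own:

  public interface IViewModelMetadata
  {
    Type ViewModelContract { get; }
    string Key { get; }
    string NavigationPath { get; }
  }

  public class ExportReport
  {
    [ImportMany(typeof(IViewModel))]
    public IEnumerable<ExportFactory<IViewModel, IViewModelMetadata>> ViewModels { get; set; }

    public void Dump()
    {
      // Nothing is created here; only the metadata is read.
      foreach (var export in ViewModels)
        Console.WriteLine("{0} -> {1}", export.Metadata.Key, export.Metadata.NavigationPath);
    }
  }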
A Custom INavigationContentLoader
Now that the overall architecture has been set up, let's take a look at controlling what's loaded when a user navigates. To create a custom content loader, the following interface needs to be implemented:

  public interface INavigationContentLoader
  {
    IAsyncResult BeginLoad(Uri targetUri, Uri currentUri,
      AsyncCallback userCallback, object asyncState);
    LoadResult EndLoad(IAsyncResult asyncResult);
    bool CanLoad(Uri targetUri, Uri currentUri);
    void CancelLoad(IAsyncResult asyncResult);
  }

The most important part of the interface is the BeginLoad method, because this method should return an AsyncResult containing the item that will be displayed in the Frame. Figure 3 shows the implementation of the custom INavigationContentLoader. This is where the magic happens: it should parse the URI, find a matching ViewModel and a matching View, and combine them.

Figure 3 Custom INavigationContentLoader

  [Export]
  public class CompositionNavigationContentLoader : INavigationContentLoader
  {
    [ImportMany(typeof(IView))]
    public IEnumerable<ExportFactory<IView, IViewMetadata>> ViewExports { get; set; }

    [ImportMany(typeof(IViewModel))]
    public IEnumerable<ExportFactory<IViewModel, IViewModelMetadata>> ViewModelExports { get; set; }

    public bool CanLoad(Uri targetUri, Uri currentUri) { return true; }

    public void CancelLoad(IAsyncResult asyncResult) { return; }

    public LoadResult EndLoad(IAsyncResult asyncResult)
    {
      return new LoadResult((asyncResult as CompositionNavigationAsyncResult).Result);
    }

    public IAsyncResult BeginLoad(Uri targetUri, Uri currentUri,
      AsyncCallback userCallback, object asyncState)
    {
      // Convert to a dummy absolute Uri so we can access the host.
      var relativeUri = new Uri("http://" + targetUri.OriginalString, UriKind.Absolute);

      // Resolve both the View and the ViewModel.
      var viewModelMapping = ViewModelExports.FirstOrDefault(o =>
        o.Metadata.Key.Equals(relativeUri.Host, StringComparison.OrdinalIgnoreCase));
      if (viewModelMapping == null)
        throw new InvalidOperationException(String.Format(
          "Unable to navigate to: {0}. Could not locate the ViewModel.",
          targetUri.OriginalString));

      var viewMapping = ViewExports.FirstOrDefault(o =>
        o.Metadata.ViewModelContract == viewModelMapping.Metadata.ViewModelContract);
      if (viewMapping == null)
        throw new InvalidOperationException(String.Format(
          "Unable to navigate to: {0}. Could not locate the View.",
          targetUri.OriginalString));

      // Get the factory for the View and the ViewModel.
      var viewFactory = viewMapping.CreateExport();
      var viewModelFactory = viewModelMapping.CreateExport();
      var view = viewFactory.Value as Control;
      var viewModel = viewModelFactory.Value as IViewModel;

      // Attach ViewModel to View.
      view.DataContext = viewModel;

      // Get navigation values and notify the ViewModel.
      var values = viewModelMapping.Metadata.GetArgumentValues(targetUri);
      viewModel.OnNavigated(values);

      if (view is Page)
      {
        Page page = view as Page;
        page.Title = viewModel.GetTitle();
      }
      else if (view is ChildWindow)
      {
        ChildWindow window = view as ChildWindow;
        window.Title = viewModel.GetTitle();
      }

      viewModel.OnLoaded();

      // Do not navigate if it's a ChildWindow.
      if (view is ChildWindow)
      {
        ProcessChildWindow(view as ChildWindow, viewModel);
        return null;
      }
      else
      {
        // Navigate because it's a Control.
        var result = new CompositionNavigationAsyncResult(asyncState, view);
        userCallback(result);
        return result;
      }
    }

    private void ProcessChildWindow(ChildWindow window, IViewModel viewModel)
    {
      // Close the ChildWindow if the ViewModel requests it.
      var closableViewModel = viewModel as IClosableViewModel;
      if (closableViewModel != null)
        closableViewModel.CloseView += (s, e) => { window.Close(); };

      // Show the window.
      window.Show();
    }
  }

The first thing to notice is the Export attribute on the class; this is required to be able to import this loader in the CompositionNavigationBehavior. The most important parts of the class are the ViewExports and ViewModelExports properties. These enumerations contain all exports for the Views and the ViewModels, including their metadata, which is exactly what BeginLoad needs in order to resolve a navigation request.
BeginLoad is the method that provides the Frame with the content to display after navigating to a given URI, but only after both the View and the ViewModel have been resolved from the export collections. Instead of using a Lazy object, I'm using an ExportFactory for those collections. Both classes will only instantiate the object when required, but with the Lazy class you can only create a single instance of the object, whereas the ExportFactory (named after the Factory pattern) allows you to request a new instance of the exported type whenever you feel like it. This is a huge difference: every navigation can produce a fresh View and ViewModel.

Creating and Processing Objects
Let's say you tell the frame to navigate to Customers. The first thing to do is find the correct ViewModel. Using a lambda expression you can find the correct ViewModel based on its key; because the metadata is available without instantiating anything, only the exports of interest are ever created. Remember the following export:

  [ViewModelExport(typeof(ICustomersViewModel), "Customers")]
  public class CustomersViewModel : ContosoViewModelBase, ICustomersViewModel

Then the following code will find the right ViewModel:

  var viewModelMapping = ViewModelExports.FirstOrDefault(o =>
    o.Metadata.Key.Equals("Customers", StringComparison.OrdinalIgnoreCase));

Next, the same thing happens for the View: instead of looking for the navigation key, you look for the ViewModelContract as defined in both the ViewModelExportAttribute and the ViewExportAttribute:

  var viewMapping = ViewExports.FirstOrDefault(o =>
    o.Metadata.ViewModelContract == viewModelMapping.Metadata.ViewModelContract);
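As a small sketch of my own (not article code), here's what using one of those factories looks like once a mapping has been found. CreateExport returns an ExportLifetimeContext<T>, so each navigation can get a brand-new ViewModel and, if you want, release it again when the View goes away, something a Lazy<T> import can't offer:

  // Assumes viewModelFactory is an ExportFactory<IViewModel, IViewModelMetadata>.
  ExportLifetimeContext<IViewModel> export = viewModelFactory.CreateExport();
  IViewModel viewModel = export.Value;   // the instance is created here, not before
  // ... bind it to the View and use it for the lifetime of that View ...
  export.Dispose();                      // optional cleanup when the View closes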
Once both ExportFactories are found, the CreateExport method allows you to create a new instance of the View and the ViewModel:

  var viewFactory = viewMapping.CreateExport();
  var viewModelFactory = viewModelMapping.CreateExport();
  var view = viewFactory.Value as Control;
  var viewModel = viewModelFactory.Value as IViewModel;

After both the View and the ViewModel have been created, the hard part is over. The ViewModel is stored in the DataContext of the View, starting the required data bindings. If the View is a Page or a ChildWindow, the title of the control is also extracted from the IViewModel object, which allows you to dynamically set the titles of your Pages and ChildWindows based on the current customer, as shown in Figure 5 (Setting a Custom Window Title).

Processing a ChildWindow is also trivial. If the View is a ChildWindow, the window should be displayed, but it shouldn't be navigated to. And if the ViewModel implements IClosableViewModel, the CloseView event of that ViewModel is linked to the Close method on the ChildWindow. The IClosableViewModel interface is simple:

  public interface IClosableViewModel : IViewModel
  {
    event EventHandler CloseView;
  }

When the ViewModel raises the CloseView event, the Close method of the ChildWindow gets called. This allows you to indirectly connect the ViewModel to the View:

  // Close the ChildWindow if the ViewModel requests it.
  var closableViewModel = viewModel as IClosableViewModel;
  if (closableViewModel != null)
    closableViewModel.CloseView += (s, e) => { window.Close(); };

If the View isn't a ChildWindow, then it should simply be made available in the IAsyncResult, which will show the View in the Frame.

Passing Arguments to the ViewModel
Take a look at the IViewModel interface, and pay attention to the OnNavigated method:

  public interface IViewModel
  {
    void OnLoaded();
    void OnNavigated(NavigationArguments args);
    string GetTitle();
  }

Because it's mandatory that each ViewModel implements the IViewModel interface, it's possible to call the OnNavigated method after resolving the ViewModel:

  // Get navigation values.
  var values = viewModelMapping.Metadata.GetArgumentValues(targetUri);
  viewModel.OnNavigated(values);

When the CustomersViewModel wants to open a CustomerDetailViewModel, the following happens:

  NavigationService.Navigate("Customer/{0}", SelectedCustomer.Id);

When you navigate to Customers/1, this path is parsed and the arguments are combined in the NavigationArguments class (this is just a Dictionary with extra methods like GetInt, GetString and so on). To find the arguments, I wrote a class containing two extension methods that do some work based on the information in the ViewModel metadata (see Figure 4, Extension Methods for Navigation Arguments).
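The article describes NavigationArguments only as "a Dictionary with extra methods like GetInt"; the real class in the code download may differ, but one possible shape, shown purely to illustrate the idea, is this:

  public class NavigationArguments : Dictionary<string, string>
  {
    public int? GetInt(string key)
    {
      string raw;
      int value;
      if (TryGetValue(key, out raw) && int.TryParse(raw, out value))
        return value;
      return null;   // missing or malformed argument
    }

    public string GetString(string key)
    {
      string raw;
      return TryGetValue(key, out raw) ? raw : null;
    }
  }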
These arguments then arrive in the CustomerDetailViewModel and can be used to pass to the DataService, for example:

  public override void OnNavigated(NavigationArguments args)
  {
    var id = args.GetInt("Id");
    if (id.HasValue)
    {
      Customer = DataService.GetCustomerById(id.Value);
    }
  }

The Final Chores
After all these great little things, there's one last step: the OnLoaded method of the ViewModel is called to notify the ViewModel that all the heavy lifting has been done, that everything has been loaded correctly and that all Imports, if there are any, have been satisfied. You shouldn't underestimate this last step when you're using the Import and ImportMany attributes. If you're using an ImportingConstructor, you definitely know when all Imports were imported (that would be when the constructor is called). But when working with the Import/ImportMany attributes, you'd otherwise have to write code in all your properties to set flags in order to know when all properties have been imported. The OnLoaded method solves this issue for you.

Using the Example Code
The code download for this article contains an MVVM application using this type of custom navigation with MEF. The solution contains the following examples:
• Navigating to a regular UserControl
• Navigating to a regular UserControl by passing arguments (../#Employee/DiMattia)
• Navigating to a ChildWindow by passing arguments (../#Customer/1)
• Imports of the INavigationService and the IDataService
• Examples of ViewExport and ViewModelExport configuration

This article should have provided a good idea of how the sample works; play with the code and customize it for your own applications. You'll see how powerful and flexible MEF can be, and this proves again that the metadata concept in MEF is really useful.

SANDRINO DI MATTIA is a software engineer at RealDolmen and has a passion for everything that is Microsoft. He also participates in user groups and writes articles on his blog at blog.sandrinodimattia.net.

THANKS to the following technical experts for reviewing this article: Glenn Block and Daniel Plaisted

PARALLEL COMPUTING
Data Processing: Parallelism and Performance
Johnson M. Hart

This article discusses: evaluation criteria for alternative parallelism strategies, optimizing a benchmark data search problem, and a performance comparison of different approaches. Technologies discussed: PLINQ, C#, Microsoft .NET Framework. Code download available at: jmhartsoftware.com

Processing data collections is a fundamental computing task, and a number of practical problems are inherently parallel, potentially enabling improved performance and throughput on multicore systems. In this article, I'll compare several distinct Windows-based methods to solve problems with a high degree of data parallelism.

The benchmark I'll use for this comparison is a search problem ("Geonames") from Chapter 9 of Troy Magennis' book, "LINQ to Objects Using C# 4.0" (Addison-Wesley, 2010). The alternative solutions are:
• Parallel Language Integrated Query (PLINQ) and C# 4.0, with and without enhancements to the original code.
• Windows C#/Microsoft .NET Framework multithreaded code, using threads and memory-mapped files.
• Windows native code using C, the Windows API, threads and memory-mapped files.
Other parallelism techniques, such as the Windows Task Parallel Library (TPL), are not examined directly, although PLINQ is layered on the TPL. The source code for all solutions is available on my Web site (jmhartsoftware.com).

Results Summary
This article will show that for a representative benchmark search problem:
• You can successfully exploit multicore 64-bit systems to improve performance in many data-processing problems, and PLINQ can be part of the solution.
• Competitive, scalable PLINQ performance requires indexed data collection objects; supporting the IEnumerable interface isn't sufficient.
• The C#/.NET and native code solutions are fastest.
• The original PLINQ solution is slower by a factor of nearly 10, and it doesn't scale beyond two tasks, whereas the other solutions scale well up to six tasks with six cores (the maximum tested). Code enhancements improve the original solution significantly.
• The PLINQ code is the simplest and most elegant in all respects, because LINQ provides a declarative query capability for memory-resident and external data. The C#/.NET code is considerably better than, but not as simple as, the PLINQ code. The native code is ungainly.
• All methods scale well with file size up to the limits of the test system physical memory.

Comparing and Evaluating Alternative Solutions
The solution evaluation criteria, in order of importance, are:
• Total performance in terms of elapsed time to complete the task.
• Scalability with the degree of parallelism (number of tasks), core count and data collection size.
• Code simplicity, elegance, ease of maintenance and similar intangible factors.
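For readers who haven't used PLINQ, the programming model being evaluated reduces to very little code. The snippet below is my own generic illustration, not the benchmark code: AsParallel turns a LINQ-to-Objects query into a parallel one, and WithDegreeOfParallelism caps the number of concurrent tasks (the "DoP" axis used throughout the measurements that follow). It needs only System and System.Linq.

  int[] numbers = Enumerable.Range(0, 10000000).ToArray();
  long evenCount = numbers
    .AsParallel()
    .WithDegreeOfParallelism(4)      // the DoP knob
    .Where(n => (n & 1) == 0)
    .LongCount();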
The Benchmark Problem: Geonames
The idea for this article came from Chapter 9 of Magennis' LINQ book, which demonstrates PLINQ by searching a geographic database containing more than 7.25 million place names in an 825MB file (that's more than one location for every 1,000 people). Each place name is represented by a UTF-8 (en.wikipedia.org/wiki/UTF-8) text line record (variable length) with more than 15 tab-separated data columns. Note: the UTF-8 encoding assures that a tab (0x9) or line feed (0xA) value won't occur as part of a multi-byte sequence; this is essential for several implementations.

Magennis' Geonames program implements a hard-coded query to identify all locations with an elevation (column 15) greater than 8,000 meters, displaying the location name, country and elevation, sorted by decreasing elevation. In case you're wondering, there are 16 such locations, and Mt. Everest is the highest at 8,848 meters.

This benchmark is interesting for several reasons:
• The subject (geographical places and attributes) is inherently interesting, and it's easy to generalize the query.
• There's a high degree of data parallelism; in principle, every record could be processed concurrently.
• The file size is modest by today's standards, but it's simple to test with larger files simply by concatenating the Geonames allCountries.txt file to itself multiple times.
• The processing is not stateless: it's necessary to determine the line and field boundaries in order to partition the file, and the lines need to be processed to identify the individual fields.

An assumption: the number of identified records (in this instance, locations higher than 8,000 meters) is small, so that sorting and display time are minimal relative to the overall processing time.

Another assumption: the performance results represent the time required to process memory-resident data collections, such as data produced by a prior program step. The benchmark program does read a file, but the test programs are run several times to assure that the file is memory-resident. I'll mention the time required to load the file initially; this time is approximately the same for all solutions.

Performance Comparison
The first test system is a six-core desktop system running Windows 7 (AMD Phenom II, 2.80 GHz, 4GB RAM); this system doesn't support hyper-threading, or HT (en.wikipedia.org/wiki/Hyper-threading). Later, I'll present results for three other systems with hyper-threading and different core counts.

Figure 1 (Geonames Performance as a Function of Degree of Parallelism) shows the results for six different Geonames solutions, with elapsed time in seconds as a function of "Degree of Parallelism," or DoP (the number of parallel tasks, which can be set higher than the number of processors). The test system has six cores, and six tasks turned out to be optimal; using more than six tasks degraded performance. All tests use the original Geonames 825MB allCountries.txt data file. Magennis reports elapsed times of 22.3 seconds (one core) and 14.1 seconds (two cores) on his system. Previous experience (for example, my article "Windows Parallelism, Fast File Searching and Speculative Processing" at informit.com/articles/article.aspx?p=1606242) shows that files of this size can be processed in a few seconds and that performance scales well with the core count, so I decided to attempt to replicate that experience and also try to enhance Magennis' PLINQ code for better performance. The initial PLINQ enhancements almost doubled the performance but didn't improve scalability.

Figure 2 (Geonames Performance as a Function of File Size) shows performance using DoP 4 and larger files (825MB, 1,651MB and 3,302MB, created by concatenating copies of the original file) for the three fastest solutions: Index, Native and .NET Threads. The test system in this case has four cores (AMD Phenom Quad-Core, 2.40 GHz, 8GB RAM).

The implementations are (fuller explanations will follow):
1. Geonames Original. This is Magennis' original PLINQ solution. The performance isn't competitive and doesn't scale with processor count.
2. Geonames Helper. This is a performance-enhanced version of Geonames Original. The performance is more than double that of the original, but it still doesn't scale.
3. Geonames MMChar. This was an unsuccessful attempt to enhance Geonames Helper with a memory-mapped file class similar to the one used in Geonames Threads.
4. Geonames MMByte. This solution modifies MMChar to process the individual bytes of the input file, whereas the previous three solutions convert UTF-8 characters to Unicode (at 2 bytes each). The performance is the best of the first four solutions.
5. Geonames Index. This PLINQ solution preprocesses the data file (requiring about nine seconds) to create a memory-resident List<byte[]> object for subsequent PLINQ processing. The preprocessing cost can be amortized over multiple queries, giving performance that's only slightly slower than Geonames Native and Geonames Threads.
6. Geonames Threads. This is the C#/.NET implementation using threads and a memory-mapped file. The performance is faster than Index and about the same as Native. This solution and Geonames Native provide the best parallelism scalability.
7. Geonames Native (not shown in Figure 1). This is the C Windows API implementation using threads and a memory-mapped file, as in Chapter 10 of my book "Windows System Programming" (Addison-Wesley, 2010). Full compiler optimization is essential for these results; the default optimization provides only about half the performance.

All implementations are 64-bit builds. 32-bit builds work in most cases, but they fail for the larger files (see Figure 2). I'll now describe implementations two through seven and discuss the PLINQ techniques in more detail.
The Enhanced PLINQ Solutions: Geonames Helper
Figure 3 shows Geonames Helper, with my changes to the Geonames Original code (highlighted in bold face in the original listing). As many readers may not be familiar with PLINQ and C# 4.0, I'll provide a few comments about the code, including descriptions of the enhancements:
• The first statements in Main allow the user to specify the input file name and the degree of parallelism (the maximum number of concurrent tasks) on the command line; these values are hard-coded in the original.
• The File.ReadLines call starts to read the file lines lazily, implicitly typing lines as a String sequence (it supports IEnumerable<String>). The lines values aren't used until the query executes.
• The query itself is LINQ code along with the PLINQ AsParallel extension. The code is similar to SQL, and the variable "q" is implicitly typed as a sequence of objects consisting of an integer elevation and a String. Notice that PLINQ performs all the thread-management work: the AsParallel method is all that's needed to turn serial LINQ code into PLINQ code, and WithDegreeOfParallelism controls the number of tasks.
• The Helper.ExtractIntegerField call is the key to the improved performance of Geonames Helper compared to Geonames Original. The original program uses the String.Split method in a manner similar to that used to display the results in the final loop; ExtractIntegerField instead simply extracts and evaluates the specified field (elevation in this case), so it's no longer necessary to allocate String objects for every field in every line. Figure 1 shows that this enhancement doubles the performance with DoP 1.

Figure 3 Geonames Helper with the Changes to the Original PLINQ Code

  class Program
  {
    static void Main(string[] args)
    {
      const int nameColumn = 1;
      const int countryColumn = 8;
      const int elevationColumn = 15;

      String inFile = "Data/AllCountries.txt";
      int degreeOfParallelism = 1;
      if (args.Length >= 1) inFile = args[0];
      if (args.Length >= 2) degreeOfParallelism = int.Parse(args[1]);
      Console.WriteLine("Geographical data file: {0}. Degree of Parallelism: {1}.",
        inFile, degreeOfParallelism);

      var lines = File.ReadLines(Path.Combine(Environment.CurrentDirectory, inFile));

      var q = from line in lines.AsParallel()
                .WithDegreeOfParallelism(degreeOfParallelism)
              let elevation = Helper.ExtractIntegerField(line, elevationColumn)
              where elevation > 8000 // elevation in meters
              orderby elevation descending
              select new { elevation = elevation, thisLine = line };

      foreach (var x in q)
      {
        if (x != null)
        {
          String[] fields = x.thisLine.Split(new char[] { '\t' });
          Console.WriteLine("{0} ({1}m) - located in {2}",
            fields[nameColumn], x.elevation, fields[countryColumn]);
        }
      }
    }
  }

Figure 4 shows the Helper class ExtractIntegerField method. It skips to the specified field and extracts the decimal value, avoiding library methods for performance, and the code uses arrays rather than pointer dereferencing.

Figure 4 Geonames Helper Class and ExtractIntegerField Method

  class Helper
  {
    public static int ExtractIntegerField(String line, int fieldNumber)
    {
      int value = 0, iField = 0;
      byte digit;

      // Skip to the specified field number and extract the decimal value.
      foreach (char ch in line)
      {
        if (ch == '\t')
        {
          iField++;
          if (iField > fieldNumber) break;
        }
        else
        {
          if (iField == fieldNumber)
          {
            digit = (byte)(ch - 0x30);  // 0x30 is the character '0'
            if (digit >= 0 && digit <= 9)
            {
              value = 10 * value + digit;
            }
            else
            {
              // Character not in [0-9]. Reset the value and quit.
              value = 0;
              break;
            }
          }
        }
      }
      return value;
    }
  }

Geonames Native
Geonames Native exploits the Windows API, threads and file memory mapping. (Memory mapping allows a file to be referenced as if it were in memory without explicit I/O operations.) The basic code patterns are described in Chapter 10 of "Windows System Programming." The program must manage the threads directly and must also carefully map the file to memory. The performance is much better than all the PLINQ implementations except Geonames Index.

There is, however, an important distinction between the Geonames problem and a simple, stateless file search or transformation: the challenge is to determine the correct method to partition the input data so as to assign different partitions to different tasks. There's no obvious way to determine the line boundaries without scanning the entire file, so it isn't feasible to assign each task a fixed set of lines. However, the solution is straightforward when illustrated with DoP 4:
• Divide the input file into four equal partitions, with the partition start location communicated to each thread as part of the thread function argument.
• Have each thread then process all lines (records) that start in its partition. This means a thread will probably scan into the next partition in order to complete processing of the last line that starts in its own partition.

Geonames Threads
Geonames Threads uses the same logic as Geonames Native; in fact, some code is the same or nearly the same. However, lambda expressions, extension methods, containers and other C#/.NET features greatly simplify the coding. As with MMByte and MMChar (below), the file memory mapping requires "unsafe" classes and C#/native code interoperability in order to use pointers to the mapped memory. The effort is worthwhile, because Geonames Threads performance is the same as Geonames Native performance with much simpler code.

Geonames MMChar and Geonames MMByte
Geonames MMChar represents an unsuccessful attempt to improve performance by memory mapping the input file, using a custom class, FileMmChar, that supports the IEnumerable<String> interface. Geonames MMByte, however, does yield significant benefits, as the input file bytes aren't expanded to Unicode; the FileMmByte class is similar but deals with byte[] objects rather than String objects. The only significant code change is to the statements that read the file, which become:

  var lines = FileMmByte.ReadLines(Path.Combine(Environment.CurrentDirectory, inFile));

The code for public static IEnumerable<byte[]> ReadLines(String path) in FileMmByte constructs a FileMmByte object and an IEnumerator<byte[]> object that scans the mapped file for individual lines. Note that the FileMmChar and FileMmByte classes are "unsafe" because they create and use pointers to access the files, and they use C#/native code interoperability. All the pointer usage is, however, isolated in a separate assembly. The .NET Framework 4 MemoryMappedFile class doesn't help here, because it's necessary to use accessor functions to move data from mapped memory.
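To make that last point concrete, here's a sketch of my own showing the managed alternative the article sets aside: System.IO.MemoryMappedFiles (in .NET Framework 4) exposes the mapping only through accessor or stream objects, so every read copies data out of the mapped region rather than scanning it in place through a pointer. The file name is the benchmark's; everything else is illustrative.

  string path = "allCountries.txt";
  long fileLength = new FileInfo(path).Length;
  using (var mmf = MemoryMappedFile.CreateFromFile(path, FileMode.Open,
           null, 0, MemoryMappedFileAccess.Read))
  using (var accessor = mmf.CreateViewAccessor(0, 0, MemoryMappedFileAccess.Read))
  {
    long lineCount = 0;
    for (long i = 0; i < fileLength; i++)
      if (accessor.ReadByte(i) == 0x0A)   // each ReadByte call copies one byte out
        lineCount++;
    Console.WriteLine("{0} lines", lineCount);
  }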
Geonames Index
The PLINQ results (Original, Helper, MMChar and MMByte) are disappointing compared to the Native and .NET Threads results. Is there a way to exploit the simplicity and elegance of PLINQ without sacrificing performance?

Although it's impossible to determine exactly how PLINQ processes the query, it's probable that PLINQ has no good way to partition the input lines for parallel processing by separate tasks. As a working hypothesis, assume that partitioning may be the cause of the PLINQ performance problem. In the Geonames example, lines supports the IEnumerable<String> interface but isn't indexed (see also John Sharp's book, "Microsoft Visual C# 2010 Step by Step," Microsoft Press, 2010, Chapter 19). From Magennis' book (pp. 276-279), PLINQ probably uses "chunk partitioning," and the IEnumerator.MoveNext methods for the FileMmChar and FileMmByte classes are slow because they need to scan every character until the next new line is located.

What would happen if the lines collection were indexed? Could we improve PLINQ performance, especially when supplemented by memory mapping the input file? Geonames Index shows that this technique does improve performance, yielding results comparable to native code. In general, either there's a necessary up-front cost to move the lines into an in-memory list or array, which is indexed (the cost can then be amortized over multiple queries), or the file or other data source is already indexed, perhaps during generation in an earlier program step, eliminating the preprocessing cost.

The up-front indexing operation is simple: just access each line, one at a time, and add it to a list. It requires about nine seconds on the test system, and the resulting object can then serve multiple queries:

  // Preprocess the file to create a list of byte[] lines
  List<byte[]> lineListByte = new List<byte[]>();
  var lines = FileMmByte.ReadLines(Path.Combine(Environment.CurrentDirectory, inFile));
  foreach (byte[] line in lines)
  {
    lineListByte.Add(line);
  }
  // ... Multiple queries can use lineListByte ...

  var q = from line in lineListByte.AsParallel()
            .WithDegreeOfParallelism(degreeOfParallelism)
          // ... the rest of the query is unchanged ...

Note that it's slightly more efficient to process the data further by converting the list to an array, although this increases the preprocessing time.
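One way to probe the partitioning hypothesis further, and this is my own experiment rather than anything from the article, is to hand PLINQ an explicit partitioner over the indexed list (System.Collections.Concurrent.Partitioner) instead of letting it fall back on chunk partitioning over a plain IEnumerable. ExtractElevation stands in for a byte[]-based field extractor, which the article mentions but doesn't show; whether this beats lineListByte.AsParallel() directly is exactly the kind of measurement the article encourages.

  var partitioner = Partitioner.Create(lineListByte, true);   // true = load-balanced range partitioning
  var q = from line in partitioner.AsParallel()
            .WithDegreeOfParallelism(degreeOfParallelism)
          let elevation = ExtractElevation(line)   // assumed byte[]-based helper
          where elevation > 8000
          orderby elevation descending
          select line;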
A Final Performance Enhancement
Geonames Index performance can be improved still further by indexing the fields in each line so that the ExtractIntegerField method doesn't need to scan all the characters in a line out to the specified field. The implementation, Geonames IndexFields, modifies the ReadLines method so that a returned line is an object containing both a byte[] array and a uint[] array containing the locations of each field. This results in about a 33 percent performance improvement over Geonames Index and brings the performance fairly close to the native and C#/.NET solutions, although it increases the preprocessing time. Furthermore, it's now much easier to construct more general queries, as the individual fields are readily available. Geonames IndexFields is included with the code download; it's slightly faster than Index but isn't shown in Figure 5.
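The article doesn't show the IndexFields record type, but based on that description it might look roughly like the sketch below; the names and the digit-parsing convention (a non-numeric field yields zero, mirroring Figure 4) are my own.

  public sealed class IndexedLine
  {
    public byte[] Bytes;        // the UTF-8 line, without the trailing line feed
    public uint[] FieldStart;   // FieldStart[i] = offset of field i within Bytes

    public int GetIntField(int field)
    {
      int value = 0;
      for (uint i = FieldStart[field]; i < Bytes.Length && Bytes[i] != (byte)'\t'; i++)
      {
        int digit = Bytes[i] - (byte)'0';
        if (digit < 0 || digit > 9) { value = 0; break; }   // non-numeric field -> 0
        value = 10 * value + digit;
      }
      return value;
    }
  }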
Additional Test System Results
Figure 5 (Intel i7 860, 2.80GHz, Four Cores, Eight Threads, Windows 7, 4GB RAM) shows test results on an additional system. The processor supports hyper-threading, so the tested DoP values are 1, 2, 4, 6 and 8. Two additional test configurations produced similar results (the full data is available on my Web site):
• Intel i7 720, 1.60GHz, four cores, eight threads, Windows 7, 8GB RAM
• Intel i3 530, 2.93GHz, two cores, four threads, Windows XP64, 4GB RAM

Interesting performance characteristics include:
• Geonames Threads consistently provides the best performance, along with Geonames Native.
• Geonames Index is the fastest PLINQ solution, approaching the Geonames Threads performance.
• Other than Geonames Index, all PLINQ solutions scale negatively with the DoP for DoP greater than two; that is, performance decreases as the number of parallel tasks increases.
• The hyper-threading performance contribution is marginal. Geonames Threads and Geonames Index performance doesn't increase significantly for DoP greater than four, where the multithreaded performance is about 2.2 times the DoP 1 performance, whereas the AMD systems without HT (Figure 1) provided about three to four times the sequential performance with DoP greater than four. This poor HT scalability might be a consequence of scheduling two threads on logical processors of the same core rather than assuring that they run on distinct cores when possible; however, this explanation doesn't appear to be plausible, as Mark E. Russinovich, David A. Solomon and Alex Ionescu say that physical processors are scheduled before logical processors on p. 40 of their book, "Windows Internals, Fifth Edition" (Microsoft Press, 2009).

Limitations
The efficient solutions all require memory-resident data, so the performance advantages don't extend to very large data collections, where "very large" refers to data sizes that approach the system physical memory size. In the Geonames example, the 3,302MB file (four copies of the original file) could be processed on the 8GB test system, but a test with eight concatenated copies of the file was very slow with all the solutions. Performance is also best when the data files are "live," in the sense that they've been accessed recently and are likely to be in memory; paging in the data file during the initial run can take 10 or more seconds and is comparable to the indexing operation.

Results Summary
PLINQ provides an excellent model for processing in-memory data structures, and it's possible to improve existing code performance with simple changes (as with Helper) or with more advanced techniques (as with MMByte). However, none of the simple enhancements provide performance close to that of native or multithreaded C#/.NET code. PLINQ can come close to native and C#/.NET performance, but it produces good, scalable performance only when used with indexed data objects. Today's memory sizes and prices allow significant data objects, such as a file of 7.25 million place names, to be memory-resident.

Unexplored Questions
This article compared the performance of several alternative techniques to solve the same problem. The approach has been to use standard interfaces as they're described and to assume a simple shared-memory model for the processors and threads. There was, however, no significant effort to probe deeply into the underlying implementations or specific features of the test machines, and numerous questions could be investigated in the future. Here are some examples:
• What's the effect of cache misses, and is there any method to reduce the impact?
• What would be the impact of solid-state disks?
• Is there any way to reduce the performance gap between the PLINQ Index solution and the Threads and Native solutions? Experiments reducing the amount of data copying in the FileMmByte IEnumerator.MoveNext and Current methods didn't show any significant benefit.
• Is the performance close to the theoretical maximum as determined by memory bandwidth, CPU speed and other architectural features?
• Is there a way to get scalable performance on HT systems (see Figure 5) that's comparable to systems without HT (Figure 1)?
• Can you use profiling and Visual Studio 2010 tools to identify and remove performance bottlenecks?
I hope you're able to investigate further.

Using the Code and Data
All code is available on my Web site (jmhartsoftware.com), following these instructions:
• Go to the Windows System Programming support page (jmhartsoftware.com/comments_updates.html) to download the Geonames Native code and projects (a ZIP file), where you'll find the Geonames project. A ReadMe.txt file explains the structure.
• Go to jmhartsoftware.com/SequentialFileProcessingSupport.html to download a ZIP file containing the PLINQ code and Geonames Threads code. All PLINQ variations are in the GeonamesPLINQ project (Visual Studio 2010; Visual Studio 2010 Express is sufficient), and Geonames Threads is in the GeonamesThreads Visual Studio 2010 project. The projects are both configured for 64-bit release builds. A simple "Usage" comment at the head of the file explains the command-line options to select the input file, DoP and implementation. The ZIP file also contains a spreadsheet with the raw performance data used in Figures 1, 2 and 5.
• Download the GeoNames database from download.geonames.org/export/dump/allCountries.zip.

JOHNSON (JOHN) M. HART is a consultant specializing in Microsoft Windows and Microsoft .NET Framework application architecture and development, technical training and writing. He has many years of experience as a software engineer, engineering manager and architect at Cilk Arts Inc. (since acquired by Intel Corp.), Sierra Atlantic Inc., Hewlett-Packard Co. and Apollo Computer. He served as a computer science professor for many years and has authored four editions of "Windows System Programming" (Addison-Wesley, 2010).

THANKS to the following technical experts for reviewing this article: Michael Bruestle, Andrew Greenwald, Troy Magennis and CK Park

VISUAL STUDIO
Use Multiple Visual Studio Project Types for Cloud Success
Patrick Foley

This article discusses: creating an Entity Framework data model, creating an ASP.NET Dynamic Data project, creating a Windows Azure service project, creating an ASP.NET MVC project, creating a WCF RIA Services project, and adding the final touches for a complete solution. Technologies discussed: ASP.NET MVC, WCF RIA Services, ASP.NET Dynamic Data. Code download available at: code.msdn.microsoft.com/mag201101VSCloud

As you've probably noticed, there are many different project types in Visual Studio these days. Which one do you choose? All have strengths that help solve problems in different situations, and even within a single business problem, there are often multiple use cases that can best be solved by different Visual Studio project types.

I was confronted with a real-world example of such a problem recently while building out the infrastructure for a cloud-based program I lead that's designed to highlight success stories: Microsoft Solutions Advocates (microsoftsolutionsadvocates.com). I used several different Visual Studio project types to build my solution, and in this article, I'll walk through a simplified example of my project called "Customer Success Stories," or CSS.

CSS has three distinct use cases:
1. Anonymous users read success stories on a public Web site.
2. Users who belong to the program log in to a private Web site to create and edit their own success stories.
3. Administrators (like me) log in to an administrative Web site to manage and edit all the data, including minutia such as lookup tables.
The approach I settled on combined three Microsoft .NET Framework technologies:
1. ASP.NET MVC for the public site
2. WCF RIA Services for the private, customer-edit site
3. ASP.NET Dynamic Data for the administrative site

Any of these technologies could've been used to create the whole solution, but I preferred to take advantage of the best features of each. The public site has a marketing purpose, so I'll eventually engage a designer to polish the appearance. Working with a designer adds complexity, and ASP.NET MVC has a straightforward view implementation that makes it easy to incorporate a designer's vision; making the site read-only and separating it from the other use cases helps isolate the scope of the designer's involvement. ASP.NET MVC is also an ideal technology for creating a public Web site that will work everywhere, because it emits standard HTML, and I wanted the greatest reach for the public use case. (Conversely, WCF RIA Services could be used to build a great-looking public Web site, but Silverlight isn't supported on some devices, such as iPhones and iPads.)

Silverlight is here to stay, and it's perfect for creating a rich editing experience with very little programming, so long as it's reasonable to expect users to have it or install it, as would be the case with customers collaborating on a success-stories site. ASP.NET MVC could also be used to implement the customer-editing functionality, but WCF RIA Services is an even better fit.

ASP.NET Dynamic Data provides a handy way to build an administrative solution without too much work. The administrative site doesn't need to be fancy; it simply needs to provide a way to manage all of the data in the solution without having to resort to SQL Server Management Studio. As my solution evolves, the ASP.NET Dynamic Data site could conceivably be subsumed by the WCF RIA Services site. Nevertheless, it's useful at the beginning of a data-centric development project such as this one, and it costs almost nothing to build.

Targeting Windows Azure
Again, this example is based on a real-world problem, and because the solution requires a public Web site, I'm going to target Windows Azure and SQL Azure. Windows Server and SQL Server might be more familiar, but I need the operational benefits of running in the cloud (no need to maintain the OS, apply patches and so on). I barely have time to build the solution; I certainly don't have time to operate it, so Windows Azure is a must for me.

To work through this example and try it on Windows Azure, you need an account. There are various options and packages found at microsoft.com/windowsazure/offers. MSDN subscribers and Microsoft partners (including BizSpark startups; visit bizspark.com to learn more) have access to several months of free resources.
For prototyping install it, as would be the case with customers collaborating on a and learning (such as working through this example), you can use the “Introductory Special.” It includes three months of a 1GB SQL success-stories site. ASP.NET Dynamic Data provides a handy way to build an Azure database and 25 hours of a Windows Azure small compute administrative solution without too much work. The administrative instance, which should be enough to familiarize yourself with site doesn’t need to be fancy; it simply needs to provide a way to the platform. I built this example using the Introductory Special manage all of the data in the solution without having to resort without incurring any additional charges. to SQL Server Management Studio. As my solution evolves, the ASP.NET Dynamic Data site could conceivably be subsumed by ‘Customer Success Stories’ Overview the WCF RIA Services site. Nevertheless, it’s useful at the beginning This example is presented in the form of a tutorial. Several imporof a data-centric development project such as this one, and it costs tant aspects of a real-world implementation are out of scope for this article, including user-level security, testing, working with a almost nothing to build. designer and evolving beyond an extremely simplified model. I’ll attempt to address these in future articles or on my blog at pfoley.com. The steps are presented at a high level and assume some familiarity with Visual Studio and with the technologies involved. Code for the entire solution can be downloaded from code.msdn.microsoft.com/mag201101VSCloud, and click-by-click instructions for each step can be found at pfoley.com/mm2011jan. Step 1: Create a Project for the Entity Framework Data Model The Web technologies used in this example all use the Entity Framework effectively, so I chose to integrate the three use cases by having them all share a common Entity Framework model. When working with an existing database, you have to generate the model from the database. Whenever possible, I prefer to create the model first and generate the database from it, because I like to think about my design more at the model level than the database level. To keep things simple, this example uses just two entities: Company and Story. January 2011 59 Figure 2 Configuring Settings in the SQL Azure Portal msdnmagazine.com Figure 3 Generating the Database Model The Entity Framework model will be created in its own project and shared across multiple projects (I learned how to do this from Julie Lerman; see pfoley.com/mm2011jan01 for more on that). I call best practices like these “secret handshakes”—figuring it out the first time is a challenge, but once you know the secret, it’s simple: 1. Create the Entity Framework model in a “class library” project. 2. Copy the connection string into any projects that share the model. 3. Add a reference to the Entity Framework project and System.Data.Entity in any projects that share the model. To start, create a Blank Solution in Visual Studio 2010 named “Customer Success Stories” and add a Class Library project named “CSSModel.” Delete the class file and add an empty ADO.NET Entity Data Model item named “CSSModel.” Add Company and Story entities with an association between them as in Figure 1 (when you right-click on Company to add the association, make sure that “Add foreign key properties to ‘Person’ Entity” is checked on the ensuing dialog—foreign key properties are required in future steps). 
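To make the three "secret handshake" steps concrete, here is a minimal sketch of what a project that shares the CSSModel class library ends up containing. It isn't part of the article's download: the ModelSmokeTest class and CountCompanies method are hypothetical names used only for illustration, while CSSModelContainer and the Companies entity set come from the generated model used throughout the rest of the article.

    // 1. Add project references to CSSModel and System.Data.Entity.
    // 2. Copy the <connectionStrings> element from CSSModel's App.Config
    //    into this project's web.config (or app.config).
    using System.Linq;
    using CSSModel;

    public class ModelSmokeTest
    {
        // Hypothetical helper: confirms the shared model and copied
        // connection string work from the consuming project.
        public int CountCompanies()
        {
            using (var db = new CSSModelContainer())
            {
                return db.Companies.Count();
            }
        }
    }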
The model is now ready to generate database tables, but a SQL Azure database is needed to put them in. When prototyping is completed and the project is evolving, it's useful to add a local SQL Server database for testing purposes, but at this point, it's less complicated to work directly with a SQL Azure database. From your SQL Azure account portal, create a new database called "CSSDB" and add rules for your current IP address and to "Allow Microsoft Services access to this server" on the Firewall Settings tab. Your SQL Azure account portal should look something like Figure 2. In Visual Studio, right-click on the design surface and select "Generate Database from Model." Add a connection to your new SQL Azure database and complete the wizard, which generates some Data Definition Language (DDL) and opens it in a .sql file, as shown in Figure 3. Before you can execute the SQL, you must connect to the SQL Azure database (click the Connect button on the Transact-SQL Editor toolbar). The "USE" statement is not supported in SQL Azure, so you must choose your new database from the Database dropdown on the toolbar and then execute the SQL. Now you have a SQL Azure database that you can explore in Visual Studio Server Explorer, SQL Server Management Studio or the new management tool, Microsoft Project Code-Named "Houston" (sqlazurelabs.com/houston.aspx). Once you build the solution, you have an Entity Framework project that you can use to access that database programmatically, as well.

Step 2: Create the ASP.NET Dynamic Data Project
An ASP.NET Dynamic Data Web site provides a simple way to work with all the data in the database and establishes baseline functionality to ensure the environment is working properly—all with one line of code. Add a new ASP.NET Dynamic Data Entities Web Application project to the solution and call it "CSSAdmin." To use the data model from the first step, copy the connectionStrings element from App.Config in CSSModel to web.config in CSSAdmin. Set CSSAdmin as the startup project and add references to the CSSModel project and System.Data.Entity. There are lots of fancy things you can do with ASP.NET Dynamic Data projects, but it's surprisingly useful to implement the default behavior that comes by simply uncommenting the RegisterContext line in Global.asax.cs and registering the CSSModel container there.
Figure 4 Creating Services in the Windows Azure Portal
Contact [email protected]. Download a fully functional product trial today! the solution doesn’t look any different than it did running on IIS or Cassini. Deploy to your Windows Azure account by rightclicking on the CSSAdminService project and selecting Publish. When updating a real-world solution in production. Once complete. deploy first to staging to make sure everything works and then promote the staging environment into production. but it’s important to run on the dev fabric anyway to catch mistakes such as using unsupported Windows APIs as you evolve a Windows Azure solution. you’ll have to budget for running Windows Azure services nonstop. Don’t delete the service itself. run your Windows Azure application using the Web site URL shown on the service page. Figure 6 Enabling IntelliTrace While Publishing to Windows Azure 62 msdn magazine it runs in the development fabric. because doing so returns the Web site URL back into the pool of available URLs. but once you get serious about developing on Windows Azure. Obviously. you’ll need to add credentials (follow the instructions to copy a certificate to your Windows Azure account). When a service deployment fails. Build and run the project.CSSModelContainer). whether or not it’s actually running). then select “Add | Web Role Project in solution” to associate the CSSAdmin project with the cloud service project. which can take several minutes. From your Windows Azure account portal. you’ll probably want to use Windows PowerShell to script deployment. which should look similar to Figure 5. It also makes it easy to publish to Windows Azure interactively. create a Storage Service called “CSS Storage” and a Hosted Service called “CSS Service. Then select a “Hosted Service Slot” and a “Storage Account” to deploy your solution to. The cloud service project adds the infrastructure necessary to run your application in a local version of the “cloud fabric” for development and debugging. In Visual Studio.Figure 5 A Windows Azure Deployed Service implement the default behavior that comes by simply uncommenting the RegisterContext line in Global. perhaps as part of a continuous integration solution. This is great for prototyping and for simpler Windows Azure solutions. Right-click on the Roles folder in CSSAdminService. Add some test data to make sure everything’s working. but don’t add additional “cloud service solutions” from the wizard. I prefer to deploy straight to production because I’m not going to leave the solution running anyway. At this point.asax. suspend and delete the deployment to avoid charges (consumption-based plans are billed for any time a service package is deployed. While prototyping. new ContextConfiguration() { ScaffoldAllTables = true }). and you have a basic site to manage your data.cs and changing it to: DefaultModel. Visual Studio . it doesn’t always tell you exactly what’s wrong.RegisterContext(typeof(CSSModel. There are two options for the hosted service slot: production and staging.” Your Windows Azure account portal should look similar to Figure 4. The first time you do this. Now when you compile and run the solution. when you’re ready to flip the switch on a real production solution. Click OK to deploy to Windows Azure. Step 3: Create the Windows Azure Services Project The result of the previous step is a local Web site that accesses a database on SQL Azure—the next step is it to get that Web site running on Windows Azure. It usually doesn’t even tell you something is wrong. After verifying that the service works. 
add a new Windows Azure Cloud Service project to your solution called “CSSAdminService” (you must have Windows Azure Tools for Visual Studio installed). . Treat the Entity Framework model container itself as the model. I think it makes sense to add separate controllers (with associated views) for Companies and Stories. In CompaniesController.NET MVC 2 Empty Web Application. Add a new ASP. 6. 3. Select all the references for CSSPublic and set the Copy Local property to true. 2. 5.Master. all with one line of code.cs. LogOnUserControl. The next step is to create the public Web site.NET MVC.NET MVC Web Site The service status just enters a loop such as “Initializing … Busy … Stopping … Initializing …” When this happens to you—and it will—look for problems such as attempting to access local resources (perhaps a local SQL Server database) or referencing assemblies that don’t exist on Windows Azure. and then run it to see what you’re starting with.Entity as before. Naming is relevant in a Model View Controller (MVC) project.NET MVC Project The solution so far consists of an administrative Web site (albeit with no user-level security) that runs on Windows Azure and accesses a SQL Azure database. These are important decisions. but that adds complexity. Right-click on CSSPublic to make it your startup project.” This site won’t implement any behavior—it’s read-only—so there’s no need for an explicit model class. Delete the ApplicationServices connection string and authentication element from the main Web.cs.ascx and the entire Account folder under Views. but I prefer to start from a Web site structure that already works and modify it bit by bit to make it do what I want. while keeping the Home controller as a landing page. Right-click on the Controllers folder to add a new controller named “CompaniesController. Use the standard conventions of Index. but the information architecture—the site structure—has to be right first.config. Details and so on for controller methods and view names. add “using CSSModel” and change the Index method to return a list of companies as follows: Visual Studio Figure 7 A List of Companies in ASP . Use plural controller names (Companies. Enabling IntelliTrace when Figure 8 Adding a Domain Service Class 64 msdn magazine . not Company) and matching view folders. so remove all login and account functionality with these steps: 1. 4. you might prefer to start with an ASP. Run it again to make sure it still works. Copy the connection string from the CSSModel App.you deploy the package (see Figure 6) can help you pinpoint problems by identifying the specific exceptions that are being thrown.NET MVC 2 Web Application project to the solution named “CSSPublic” (don’t create a unit test project while working through this example). If you’re already quite experienced with ASP. You can tweak these conventions if you really want to.config and add references to CSSModel and System.Data. a designer can make the site look good. Delete the “logindisplay” div from Site. Delete AccountController. AccountModels.Config into the CSSPublic Web. The public site for CSS is read-only and anonymous. Step 4: Create the ASP.cs. Run the service locally in the dev fabric and then publish the site to Windows Azure and run it from the public URL. I prefer to deploy straight to production because I’m not going to leave the solution running anyway. and it’s easy to evolve.Id })%> Framework model to access a SQL Azure database. navigate to Companies. 
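The Index method promised above was separated from its code by the page layout. Pulling together the fragments that appear in the next section (CSSModelContainer, db.Companies.ToList() and the Details query), CompaniesController ends up looking roughly as follows; the Details action shown here is the one the article introduces a little later.

    using System.Linq;
    using System.Web.Mvc;
    using CSSModel;

    public class CompaniesController : Controller
    {
        // GET: /Companies/ -- list every company for the read-only public site
        public ActionResult Index()
        {
            CSSModelContainer db = new CSSModelContainer();
            return View(db.Companies.ToList());
        }

        // GET: /Companies/Details/5 -- id is the integer following
        // Companies/Details/ in the URL
        public ActionResult Details(int id)
        {
            CSSModelContainer db = new CSSModelContainer();
            return View(db.Companies.Single(c => c.Id.Equals(id)));
        }
    }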
Because this site is intended to be read-only.Entity and set Copy Local to true for all the references.Data. create a cloud service project called “CSSPublicService” and add the CSSPublic role to it. but this is a good start. but remove the unnecessary “Id” field.Companies.Companies. Don’t forget to suspend and delete the deployment when you’re done to avoid being billed.Company and View content of List.CSSModelContainer db = new CSSModelContainer().” Make sure the name of this class ends in Service—without a number—to gain the full benefit of The id parameter represents the integer following Companies\ Details\ in the URL.config. Your Companies list should now look similar to what is shown in Figure 7.Id. add a list item in the menu to reference the new Controller: <li><%: Html. Very little manual plumbing code has been required to obtain a decent amount of functionality.” Finally. In CSSCustomerEdit.ToList()). That integer is used to find the appropriate company using a simple LINQ expression.” “Delete” and “Create. delete the ActionLink entries for “Edit. Add the Details view underneath the Companies folder as before. Open ApplicationStrings. add a method to CompaniesController: public ActionResult Details(int id) { CSSModelContainer db = new CSSModelContainer(). Adding a WCF RIA Services Web site adds another dimension: a rich editing experience.NET MVC and ASP. add references to CSSModel and System. copy connectionStrings from CSSModel into Web.Web).ActionLink(item. Adding a controller and views for Stories is similar. Then right-click on the Services folder and add a new Domain Service Class item called “CSSDomainService.Single(c => c.” Make it strongly typed with a View data class of CSSModel. To create the view. make the company name itself a link to the Details view: <%: Html. return View(db. The views still require a fair amount of tweaking before being ready for a designer. Run the solution to see what you’re starting with.Master.Name.Web. } While prototyping. using a shared Entity msdnmagazine. new { id=item.resx in CSSCustomerEdit project and change the value for ApplicationName to “Customer Success Stories” to make it look nice.NET Dynamic Data Web sites running (or at least runnable) on Windows Azure.Equals(id))). Add a Silverlight Business Application project named “CSSCustomerEdit” (no spaces) and Visual Studio adds two projects to your solution: a Silverlight client (CSSCustomerEdit) and a Web service (CSSCustomerEdit.com Figure 9 XAML for the Stories View January 2011 65 . The default view is a good starting point. Step 5: Create the WCF RIA Services Project The solution at this point contains ASP. In Site. but this time select “Details” as the View content and name the view “Details. return View(db.” Run the project. To implement Details. "Index". "Companies")%></li> Run the app and click the menu to see the list of Companies.ActionLink("Companies". create an empty Companies folder under Views and right-click on it to add a view called “Index. and then click one of the company names to see the default Details view. To verify this project will work on Windows Azure. "Details". most of the magic involves editing XAML..DomainServices in CSSCustomerEdit. which means he helps soft ware companies succeed in building on the Microsoft platform. In a Silverlight Business Application. While it’s tempting to search for the “one true way” to create the next killer app for the cloud. // TODO: set myCompany during authentication public Company GetMyCompany() { return this. 
add a placeholder property named “myCompany” to represent the company the user is authorized to edit. it’s more rewarding to leverage capabilities from multiple technologies to solve complex problems. not a list of companies. change GetStories to GetMyStories and ensure that the user creates stories whose CompanyId is equal to myCompany: private int myCompany = 1. easier to evolve and—thanks to Windows Azure—easier to operate. the user can only edit the single company they work for. Notice that “Generate associated classes for metadata” is grayed out. Figure 10 The Stories View in Action 66 msdn magazine THANKS to the following technical expert for reviewing this article: Hanu Kommalapati Visual Studio . story. Build it … run it … it works! A rich editing experience that’s ready to evolve (see Figure 10).Windows.Single(c=>c. This illustrates a tradeoff with the approach I’m espousing here. then mimic the XAML from the existing Home and About views.com. Create new views (Silverlight Page items) for Company (singular) and Stories. In both DataForms. you might want to create your Entity Framework model directly in the “. so remove those service methods. WCF RIA Services shines in the creation of editable..CompanyId.Stories..Web.aspx to Default.. To implement a baseline UI for this example.aspx for a cleaner experience.Where(s=>s.ObjectContext.xaml. } . when the Entity Framework model is in a separate project from CSSCustomerEdit. but eventually the login process will set it to the right company for the authenticated user. Also. For now. Copy CSSCustomerEditTestPage. hardcode it to 1. In Silverlight development.Companies. It’s better to work incrementally and add one UI improvement at a time. As before. The DataGrid and DataForm controls are powerful.Controls. Click OK to bring up the Add New Domain Service Class dialog and check all the entities along with Enable editing for each (Figure 8).NET Framework provide myriad choices for creating solutions that can run on Windows Azure. However. public IQueryable<Story> GetMyStories() { return this. // database will replace with next auto-increment value . a DomainDataSource and a DataForm for the Company view. In the CSSDomainService class. but whenever I work too fast or try to add too much functionality at once. PATRICK FOLEY is an ISV architect evangelist for Microsoft. the toolkit doesn’t let you add these metadata classes. add references to System. As mentioned. add a DataGrid for the Stories view.ObjectContext. authentication and authorization are out of scope for this article. } public void InsertStory(Story story) { story. It’s easier to code.the tooling between the two projects (another “secret handshake”).Web to share the Entity Framework model in another project.Equals(myCompany)).Windows. and you’re done. Edit MainPage. which should look similar to Figure 9. create a new cloud service project. so change GetCompanies to GetMyCompany. You could still reference CSSCustomerEdit.NET Dynamic Data Web site). Edit the CSSDomainService class to reflect the specific use case for the project: the user can update companies but not insert or delete them (an administrator does that in the ASP.Id = -1. but it’s important to start simply and add functionality slowly. No ‘One True Way’ Visual Studio and the . Read his blog at pfoley. Similarly. handle the EditEnded event to call MyData. but it’s possible to punt and still be precise. I end up messing up and having to backtrack.Id. In addition. 
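The CSSDomainService members described above are scattered by the two-column layout; reassembled, the class looks roughly like this. The LinqToEntitiesDomainService base class, the [EnableClientAccess] attribute and the AddObject call are the standard pieces the Add New Domain Service Class dialog generates and aren't spelled out in the text, so treat them as assumptions.

    // Requires references to System.ServiceModel.DomainServices.Hosting,
    // System.ServiceModel.DomainServices.EntityFramework and the CSSModel namespace.
    [EnableClientAccess]
    public class CSSDomainService : LinqToEntitiesDomainService<CSSModelContainer>
    {
        // TODO: set myCompany during authentication; hardcoded to 1 for now
        private int myCompany = 1;

        // The user can read and update only the single company they work for.
        public Company GetMyCompany()
        {
            return this.ObjectContext.Companies.Single(c => c.Id.Equals(myCompany));
        }

        public IQueryable<Story> GetMyStories()
        {
            return this.ObjectContext.Stories.Where(s => s.CompanyId.Equals(myCompany));
        }

        public void InsertStory(Story story)
        {
            story.Id = -1;               // database will replace with next auto-increment value
            story.CompanyId = myCompany; // force new stories into the caller's company
            // Simplified; the wizard-generated method also handles already-attached entities.
            this.ObjectContext.Stories.AddObject(story);
        }
    }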
just co-opt the existing Home and About views to use for Company and Stories). metadata classes can be used to add additional validation logic such as ranges and display defaults.SubmitChanges.Web” project instead of a separate project.Controls. } In Silverlight development. most of the magic involves editing XAML. Technologies discussed: ASP.codeplex. (You can download and install the database from msftdbprodsamples. The ASP. The company’s data is stored in a Microsoft SQL Server 2008 database. this technology This article discusses: • Starting a Dynamic Data project • Using ASP.NET Routing • Supporting metadata • Customizing Dynamic Data templates also minimized the need to work directly with SQL.NET Dynamic Data—a combination of the Entity Framework and ASP.NET Dynamic Data Entities Web Site C# project. Name it HumanResources.NET. choose Add New Item from the Web site menu.NET Routing and the Entity Framework to enable you to quickly create data-driven Web sites. In this article I’ll show you how. This Web site allows the company to manage employee information. ASP.NET DYNAMIC DATA Build a Data-Driven Enterprise Web Site in 5 Minutes James Henry For years. 
developers have wrestled with the tedious tasks of building data layers with Create.NET Routing • Supporting metadata • Customizing Dynamic Data templates also minimized the need to work directly with SQL.NET Routing—which allows an application to respond to URLs that do not physically exist.edmx. Visual Studio will then prompt you to add the model to the App_Code folder. Update and Delete (CRUD) functionality to be consumed by the UI. open Microsoft Visual Studio 2010 and create a new ASP. Getting Started Just to set up the scenario. Choose Yes and allow it to make this change.) Now. However. With the DataSet designer. N E T D Y N A M I C D ATA Build a Data-Driven Enterprise Web Site in 5 Minutes James Henry For years. Fast-forward a few more years and Microsoft introduced the Entity Framework.NET. choose Add New Item from the Web site menu.com.NET Routing and the Entity Framework to enable you to quickly create data-driven Web sites. In this article I’ll show you how. This Web site allows the company to manage employee information. I can personally remember a project from more than 10 years ago that required me to write code that automatically generated business and data layers from a Rational Rose object model. but it still required the manual creation of Web pages to display data from the database.NET Dynamic Data—a combination of the Entity Framework and ASP. It was a lot of work. Now Microsoft has introduced ASP.NET Dynamic Data Entities Web Site C# project. Read. Like the DataSet.NET Dynamic Data Entities Web Site project type takes advantage of ASP. SQL Server 68 msdn magazine . With these features you can create a production-ready. data-driven Web site in just a few minutes. it still required the manual creation of Web pages to display data. This Object Relational Mapping (ORM) framework uses LINQ to abstract a database schema and presents it to an application as a conceptual schema. To get this functionality. This minimized the need to work directly with SQL. A few years later Microsoft introduced the Microsoft .A S P. For example.aspx. as shown in Figure 1. Using ASP. Also note that the EmployeeDepartmentHistory entity joins the Employee and Department entities. Figure 4 shows seven entities that Visual Studio generated from the database schema.NET Routing allows an application to respond to URLs that do not physically exist. with both views using the same display template. This would have allowed direct navigation between the Employee and Department entities.cs.aspx Figure 1 Starting the Entity Data Model Wizard When you create Web site projects. Visual Studio automatically generates a partial class for the data context and partial classes for the entities. Choose Generate from Database and click Next. two URLs—http://mysite/Employee and http://mysite/ Department—could be directed to a page at http:// mysite/Template. The page itself could then extract information from the URL to determine whether to display a list of employees or a list of departments. On the next page you can choose all tables in the Human Resources schema. In this example.com Figure 2 Configuring the Data Connection January 2011 69 . Visual Studio dynamically compiles all code that is placed in the App_Code folder. In the case of the Entity Data Model. This means that an employee can be a candidate for multiple jobs within the company. When you click Finish. 
Had the EmployeeDepartmentHistory table contained only the fields that were necessary for joining the Employee and Department tables.NET Dynamic Data uses a route handler named DynamicDataRouteHandler that interprets placeholders named {table} and {action} in URL patterns. The name of the schema appears in parenthesis. Visual Studio automatically generates entities from the tables that you chose for your project.entity participates in a one-to-many relationship with the JobCandidate entity. For example. A route is simply a URL pattern that’s mapped to an ASP. it places the code in a file named HumanResources. The handler could then display You can create a production-ready. Next. data-driven Web site in just a few minutes.aspx. you need to create a new one. Figure 3 shows some of the tables that are selected. ASP. Visual Studio used the foreign key constraints in the database to create relationships between the entities. It’s the handler’s responsibility to determine how the actual URL maps to the interpreted pattern. That pattern could then be used to interpret the URL http:// mysite/Employee/List. Now choose a connection to the Adventure Works database. Visual Studio would have simply omitted the EmployeeDepartmentHistory entity. If a connection doesn’t already exist.NET Routing ASP. Figure 2 shows a connection to AdventureWorks on a SQL Server instance named Dev\BlueVision. Employee is processed as the {table} placeholder and List is processed as the {action} placeholder. the Employee msdnmagazine. For example. a handler could use this URL pattern: http://mysite/{table}/{action}.Designer. the Entity Data Model Wizard appears.NET HTTP handler. however. you can change the names of EmployeeDepartmentHistories. you should set the ScaffoldAllTables property to false to prevent the accidental exposure of the entire database to end users. The first parameter specifies the type of the data context. you allow all entities to be viewed on the site. Details. the Route instance is actually a DynamicDataRoute instance.Web. that gets called when the application first starts. This method registers the Entity Framework data context with ASP. ASP .DataAnnotations.NET Dynamic Data makes it easy to change the name of a displayed entity by applying the Display attribute to the class that represents that entity. You need to modify the line of code as follows: DefaultModel. Similarly. This is theoretically all you need to do to use ASP.Add(new DynamicDataRoute("{table}/{action}.aspx") { Constraints = new RouteValueDictionary( new { action = "List|Details|Edit|Insert" }).NET Dynamic Data.AdventureWorksEntities). as shown here: public static void RegisterRoutes(RouteCollection routes) { // DefaultModel. EmployeePayHistories and JobCandidates to more appropriate names. } Supporting Metadata One thing you should notice is that the entity names are identical to the table names that they represent. namespace AdventureWorksModel { [Display(Name = "Employee Addresses")] public partial class EmployeeAddress { } } Figure 4 Entities Generated from Database Tables in Visual Studio 70 msdn magazine Then rebuild and run the application. ASP. you should see the page template used to display the entity. new ContextConfiguration() { ScaffoldAllTables = true }). A constraint is also set to limit the uses the {table} placeholder to determine the name of the entity to {action} values to List. If you build and run the application. // new ContextConfiguration() { ScaffoldAllTables = false }). 
and it uses the {action} parameter to determine the page Data. Figure 3 Selecting Tables for the Human Resources Schema in Visual Studio The parameter passed to the constructor of DynamicDataRoute represents the pattern that the handler should a list of Employee entity instances. Because the entity classes are contained in a file that’s automatically regenerated by Visual Studio.cs to the App_Code folder. which is AdventureWorksEntities in this example.ComponentModel.NET Dynamic Data. By setting the ScaffoldAllTables property to true.NET Dynamic display.ComponentModel.RegisterContext(typeof(YourDataContextType). Model = DefaultModel }). The DynamicDataRoute class internally uses DynamicDataRouteHandler as its handler. Then add the following code to the file: using using using using System. Edit or Insert.asax file contains a static method named RegisterRoutes shown in Figure 5. usually this is not the case in the UI. System. you’d need to apply the ScaffoldTable(true) attribute to each entity that should be viewed on the site. System.RegisterContext( typeof(AdventureWorksModel. Therefore.) The Add method of the RouteCollection class adds a new Route instance to the route table. DynamicDataRouteHandler use to process physical URLs. You should notice that EmployeeAddresses is now Employee Addresses.The first thing you need to do is uncomment the code that calls the RegisterContext method of the DefaultModel instance. add a new code file named Metadata. Had you set this property to false.NET Dynamic Data . (In practice. The Global. In code and in the database this might be fine. routes. With ASP. System. you should make any code changes to partial classes in a separate code file. Online prices may vary from those shown due to daily fluctuations & online discounts.componentsource.02 Interactive Flash & JavaScript (HTML5) charts for web apps.59 Word processing components for Visual Studio .componentsource./update/2011/01 www.com . All prices correct at the time of press.NET for Windows Forms/WPF from $499. Contact us to apply for a credit account.NET. We accept purchase orders. BEST SELLER BEST SELLER Spread for Windows Forms from $959.US & Canada: (888) 850-9911 www. BEST SELLER FusionCharts from $195.com BEST SELLER NEW VERSION TX Text Control .NET report writer. US Headquarters ComponentSource 650 Claremore Prof Way Suite 100 Woodstock GA 30188-5188 USA European Headquarters ComponentSource 30 Greyfriars Road Reading Berkshire RG1 1PE United Kingdom Asia / Pacific Headquarters ComponentSource 3F Kojimachi Square Bldg 3-3 Kojimachi Chiyoda-ku Tokyo Japan 102-0083 Sales Hotline . BEST SELLER BEST SELLER ActiveReports 6 from $685. All Rights Reserved.02 Latest release of the best selling royalty free .04 A comprehensive Excel compatible spreadsheet component for Windows Forms applications. © 1996-2010 ComponentSource. Designer. } Figure 5 A Basic ASP .Designer. So you can’t add the same field to the Shift class defined in 72 msdn magazine Figure 7 Identifying Employees with LoginIDs ASP . This might not be useful. Also. you shouldn’t manually modify HumanResources. as well as the field’s label when it’s displayed in edit or read-only mode. You apply the DataType attribute to the field to which it applies. The Employee column displays values for the NationalIDNumber field.cs because that file gets regenerated by Visual Studio. The DataType attribute specifies the Time enumerated value to indicate that the StartTime field should be formatted as a time value. 
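Figure 5's RegisterRoutes method is hard to read as extracted above. Reassembled from those same fragments, it looks roughly like this for the AdventureWorks model; remember the article's advice to set ScaffoldAllTables back to false, or to apply ScaffoldTable(true) per entity, before exposing a real database.

    public static void RegisterRoutes(RouteCollection routes)
    {
        // Register the Entity Framework data context with ASP.NET Dynamic Data.
        DefaultModel.RegisterContext(
            typeof(AdventureWorksModel.AdventureWorksEntities),
            new ContextConfiguration() { ScaffoldAllTables = true });

        // {table} picks the entity; {action} picks the page template
        // (List, Details, Edit or Insert).
        routes.Add(new DynamicDataRoute("{table}/{action}.aspx")
        {
            Constraints = new RouteValueDictionary(
                new { action = "List|Details|Edit|Insert" }),
            Model = DefaultModel
        });
    }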
Now rebuild and run the application.NET Dynamic Data . such as EmailAddress or Time. An attribute named DataType allows you to specify a more specific data type for a field. You can’t apply the additional metadata directly to members of the Shift class because you can’t add the same class members to more than one partial class. it does have a field for an Figure 6 Revised Shifts Page Using MetadataType Definitions Next click the Shifts link. set. This displays a list of employee shifts. First. Other enumerated values include PhoneNumber.Time)] [Display(Name = "Start Time")] public DateTime StartTime { get.Metadata. Add the following code to the ShiftMetadata class: [DataType(DataType. so I’ll format the Start Time and End Time values so that they display times only.cs file and add the following class definitions: [MetadataType(typeof(ShiftMetadata))] public partial class Shift { } public partial class ShiftMetadata { } Notice that an attribute named MetadataType is applied to the Shift class. In this context. This attribute allows you to designate a separate class that contains additional metadata for an entity’s fields. Figure 6 shows the result of clicking the Shifts link. For example.NET Dynamic Data Site Visual Studio automatically generates entities from the tables that you chose. Currency and EmailAddress. The MetadataType attribute allows you to specify an entirely different class so that you can apply metadata to a field. the Shift class defined in HumanResources. The Display attribute specifies the text that should be displayed as the field’s column when the field is displayed in a list. I’ll change the names StartTime and EndTime to Start Time and End Time.cs. Now let’s take a look at the JobCandidates link.cs already has a field named StartTime. Although the Employee database table doesn’t have a field for an employee’s name. You can add similar code to change the appearance of the EndTime field. to name a few. Password. Another issue is that the Start Time and End Time values show both the date and the time. open the Metadata. respectively. the user really only needs to see the time. . NET Framework 4 recommend that you use the Display attribute instead of the DisplayName attribute from the .ascx. In Visual Studio. which is LoginID. Alternatively. There are two reasons for preferring Display instead of DisplayName.cs file and add the following class definition: [DisplayColumn("LoginID")] public partial class Employee { } public partial class EmployeeMetadata { [Editable(false)] public int ContactID { get. if (Mode == DataBoundControlMode.MinValue.cs file and modify the Employee class by applying the following MetadataType attribute: [DisplayColumn("LoginID")] [MetadataType(typeof(EmployeeMetadata))] public partial class Employee { } The Editable attribute specifies whether or not a field is editable in the UI.5. you can apply the UIHint attribute to a field to ensure that a different user control is used. } public override Control DataControl { get { return DateCalendar.NET Dynamic Data .ascx user control contains the UI for editing Boolean fields. while the Boolean_Edit. Visual Studio allows direct editing of the ContactID field. the Display attribute supports localization while the DisplayName attribute does not. we employee’s login. I’ll prevent editing of this field while still allowing it to be displayed for reference. out date). whether a field shows up as a filter. ASP. 
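Putting the Shift metadata pieces above into one place, Metadata.cs ends up with the following. The EndTime property isn't shown in the article's listing but the text says it is handled the same way, so its attributes here are an illustrative assumption; the attributes come from System.ComponentModel.DataAnnotations.

    using System;
    using System.ComponentModel.DataAnnotations;

    namespace AdventureWorksModel
    {
        [MetadataType(typeof(ShiftMetadata))]
        public partial class Shift { }

        public partial class ShiftMetadata
        {
            [DataType(DataType.Time)]
            [Display(Name = "Start Time")]
            public DateTime StartTime { get; set; }

            // Per the article, EndTime gets the same treatment.
            [DataType(DataType.Time)]
            [Display(Name = "End Time")]
            public DateTime EndTime { get; set; }
        }
    }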
The DisplayColumn attribute specifies the name of the field from an entity that should be used to represent instances of that entity. DisplayName is obviously still in the framework. I’ll add metadata to the EmployeePayHistory entity to display the Rate field as Hourly Rate. To enforce relational integrity.ascx user control contains the UI for displaying Boolean fields. set. —Scott Hunter. add a new Dynamic Data Field item to the project and name it Date. the Display attribute allows you to control all kinds of things.NET Dynamic Data associates a field with a user control that has the same name as the field’s associated data type. Next. Open the Metadata.SelectedDate = DateCalendar. } } ASP .ToString().NET Framework 4. For example.ToolTip = Column. Visual Studio automatically adds Figure 9 Customized Employee Edit Form a second file named Date_Edit. Add the following class definitions to the Metadata. header and so on). FieldTemplateUserControl { protected void Page_Load(object sender.DynamicData.NET Routing allows an application to respond to URLs that do not physically exist.Attribute Changes in the . and following these practices will make sure code works with them. but we recommend against it whenever the Display attribute can be used. I’ll add a custom field template to display a Calendar control for editing of the Employee entity’s BirthDate field. By default. ASP. This field might provide more useful information to a user of this site. This folder contains user controls for editing fields of various data types. DateTime. Senior Program Manager Lead. we recommend that you replace DisplayName and ScaffoldColumn with the Display attribute.OnDataBinding(e).NET Figure 8 Custom Date_EditField Class public partial class Date_EditField : System. Second. which is part of the Person schema in the database.VisibleDate = date.Name] = ConvertEditedValue( DateCalendar. or whether the field is shown in the scaffold (AutoGenerate=false turns it off).Description. So. EventArgs e) { DateCalendar.Edit && FieldValue != null) { DateTime date = DateTime. } protected override void OnDataBinding(EventArgs e) { base. First. I’ll again modify the metadata code so that all Employee columns display the value of the LoginID field.NET Framework 3.ToShortDateString()). For example. Open the Metadata. the Boolean. Following these recommendations is best because other teams at Microsoft support the DataAnnotations namespace (WCF RIA Services). while the code shown in the examples in this article is completely valid. Figure 7 shows the new list with LoginIDs.Web. The ContactID field of the Employee entity actually refers to a row in the Contact table. Because I didn’t add any tables from the Person schema. DateCalendar.SelectedDate. as well as format the field’s value as currency. } } } Starting with the Microsoft . } } Customizing Templates Then define the EmployeeMetadata class as follows: 74 msdn magazine The Visual Studio project contains a folder named FieldTemplates. To do this.TryParse(FieldValue.ascx to the ASP . you can control the text for the various ways a field can be displayed (prompt. } } protected override void ExtractValues( IOrderedDictionary dictionary) { dictionary[Column. set.cs file: [MetadataType(typeof(EmployeePayHistoryMetadata))] public partial class EmployeePayHistory { } public partial class EmployeePayHistoryMetadata { [DisplayFormat(DataFormatString="{0:c}")] [Display(Name = "Hourly Rate")] public decimal Rate { get. 
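Figure 8's Date_EditField class is badly mangled by the extraction above. The following is a reassembled sketch: the member names (DateCalendar, ConvertEditedValue, ExtractValues, DataControl) all appear in the original fragments, but the exact split of logic between Page_Load and OnDataBinding is a best guess.

    using System;
    using System.Collections.Specialized;
    using System.Web.UI;
    using System.Web.UI.WebControls;

    public partial class Date_EditField : System.Web.DynamicData.FieldTemplateUserControl
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            // Show the column's description as a tooltip on the Calendar control.
            DateCalendar.ToolTip = Column.Description;
        }

        protected override void OnDataBinding(EventArgs e)
        {
            base.OnDataBinding(e);
            if (Mode == DataBoundControlMode.Edit && FieldValue != null)
            {
                DateTime date = DateTime.MinValue;
                DateTime.TryParse(FieldValue.ToString(), out date);
                DateCalendar.SelectedDate = date;
                DateCalendar.VisibleDate = date;
            }
        }

        protected override void ExtractValues(IOrderedDictionary dictionary)
        {
            // Write the edited value back under the field's column name.
            dictionary[Column.Name] =
                ConvertEditedValue(DateCalendar.SelectedDate.ToShortDateString());
        }

        public override Control DataControl
        {
            get { return DateCalendar; }
        }
    }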
You still need to use ScaffoldTable and the DisplayName attribute at the class level.
} folder.aspx page template renders a read-only view of an entity. } The DataType attribute provides automatic mapping by matching the data type name to the name of the user control. the DropDownList control contains all distinct foreign [UIHint("Date")] key values. ASP. First. you’ve seen that it’s entirely possible to create a fully operational Human Resources Web site in just a few minutes. I can accomplish this in one of two ways.ascx page with the following markup: edit and insert mode. Insert. The PageTemplates folder contains page templates that render the appropriate views for entities. He can be reached at msdnmag@bluevisionsoftware. True and False. I also modify the ExtractValues overridFor each list of entities.NET Dynamic Data to use the For Boolean filters.FieldTemplates folder.NET Dynamic Data provides out-of-the-box functionality that allows you to get a site up and running quickly.ascx. it uses the List. The Details. aspx page template allows you to view a list of entities as well as details for the selected entity on a single page. Alternatively. you must ensure that you add it to the route table in the Global. Default_Edit. For example.aspx page template displays an editable view of an entity. set.ascx field templates for the BirthDate three values: All.NET Dynamic Data should make your life a lot easier. ASP. To create a template that applies to only a specific entity.ascx.asax file.ascx and Default_Insert. the DropDownList control simply contains Date.ascx. ASP. it’s also entirely customizable. If you add a new one. data field being rendered. this folder contains three templates named Default. only three user controls exist: Boolean.aspx. Shifts_Edit. By default. For enumeration filters.NET Dynamic Data evaluates an {action} parameter as List. Ship It! Although this was a fictional scenario. Edit. The FieldValue field is inherited Shifts_Insert.NET Dynamic I override the OnDataBinding Data uses this user control to method to set the SelectedDate and render the read-only mode for VisibleDate properties of the Calendar a Shift entity (Shifts entity set). [DataType(DataType. For example.NET Dynamic Data uses a filter list. Details. If you change the birth date of the selected employee and click Update. respectively.ascx and Date_Edit. if ASP. The Insert.ascx. The List. You can alter the existing page templates or add a new one to the folder. 76 msdn magazine You can apply the UIHint attribute to a field to ensure that a different user control is used. ASP.NET Dynamic Data uses the entity’s den method to save any changes to the SelectedDate property to a foreign key fields. However. I then proceeded to enhance the UI by adding metadata and a custom field template. 2003). ASP.aspx page template to display a list of entities. The filters are added as DropDown controls to the list the values in this dictionary to update the underlying data source.ascx and ForeignKey. JAMES HENRY is an independent software developer for BlueVision LLC. Figure 9 shows the result of editing an employee.NET” (Blue Vision. and it represents the value of the and insert modes of the Shift entity.ascx and Figure 10 Including Data Filters on the Page field. 2005) and “Developing . allowing it to meet the needs of individual developers and organizations.Date)] public DateTime BirthDate { get. if you add a new file with the complete class definition user control named Shifts. 
If you change the birth date of the selected employee and click Update, the new data will be saved to the database. Figure 9 shows the result of editing an employee.

For each list of entities, ASP.NET Dynamic Data uses a filter list for foreign key fields, Boolean fields and enumeration fields. The filters are added as DropDownList controls to the list page, as shown in Figure 10. The filters are defined as user controls in the Filters folder. By default, only three user controls exist: Boolean.ascx, Enumeration.ascx and ForeignKey.ascx. For Boolean filters, the DropDownList control contains three values: All, True and False. For enumeration filters, the DropDownList control contains all enumerated values. For foreign key filters, the DropDownList control contains all distinct foreign key values.

Figure 10 Including Data Filters on the Page

Ship It!
Although this was a fictional scenario, you've seen that it's entirely possible to create a fully operational Human Resources Web site in just a few minutes. ASP.NET Dynamic Data provides out-of-the-box functionality that allows you to get a site up and running quickly. However, it's also entirely customizable, allowing it to meet the needs of individual developers and organizations. I then proceeded to enhance the UI by adding metadata and a custom field template. If you've become frustrated with having to continually perform the tedious tasks of implementing CRUD pages on every Web application project, ASP.NET Dynamic Data should make your life a lot easier.

JAMES HENRY is an independent software developer for BlueVision LLC, a company that specializes in Microsoft technology consulting. He's the author of "Developing Business Intelligence Solutions Using Information Bridge and Visual Studio .NET" (Blue Vision, 2005) and "Developing .NET Custom Controls and Designers Using C#" (Blue Vision, 2003). He can be reached at msdnmag@bluevisionsoftware.com.

THANKS to the following technical expert for reviewing this article: Scott Hunter

THE WORKING PROGRAMMER        TED NEWARD

Multiparadigmatic .NET, Part 5: Automatic Metaprogramming

In last month's piece, objects came under the microscope, and in particular we looked at the "axis" of commonality/variability analysis that inheritance offers us. While inheritance isn't the only form of commonality/variability available within a modern object-oriented (OO) language such as C# or Visual Basic, it certainly stands at the center of the OO paradigm. And, as also discussed, it doesn't always provide the best solution to all problems.

To recap, what we've discovered so far is that C# and Visual Basic are both procedural and OO languages—but clearly the story doesn't stop there. Both are also metaprogrammatic languages—each offers the Microsoft .NET Framework developer the opportunity to build programs out of programs, in a variety of different ways: automatic, reflective and generative.

Automatic Metaprogramming
The core idea behind metaprogramming is simple: the traditional constructs of procedural or OO programming haven't solved quite all of our software design issues, at least not in a way that we find satisfying. For example, developers frequently found a need for a data structure that maintained an ordered list of some particular type, but strongly typed to the type being stored within it—we wanted an ordered linked list, such that we could insert items into the list in a particular slot and see the items in that exact order. For performance reasons, sometimes the list wanted to be in a linked list of nodes.

Developers who came to the .NET Framework from the C++ world know one solution to this problem—that of parameterized types, also known more informally as generics. But, as developers who came to .NET through Java's early days know, another solution emerged long before templates (which did, eventually, make it into the Java platform). That solution was to simply write each needed list implementation as necessary, as shown in Figure 1.

Figure 1 An Example of Writing List Implementations as Necessary

  Class ListOfInt32
    Class Node
      Public Sub New(ByVal dt As Int32)
        data = dt
      End Sub
      Public data As Int32
      Public nextNode As Node = Nothing
    End Class

    Private head As Node = Nothing

    Public Sub Insert(ByVal newParam As Int32)
      If IsNothing(head) Then
        head = New Node(newParam)
      Else
        Dim current As Node = head
        While (Not IsNothing(current.nextNode))
          current = current.nextNode
        End While
        current.nextNode = New Node(newParam)
      End If
    End Sub

    Public Function Retrieve(ByVal index As Int32)
      Dim current As Node = head
      Dim counter = 0
      While (Not IsNothing(current.nextNode) And counter < index)
        current = current.nextNode
        counter = counter + 1
      End While
      If (IsNothing(current)) Then
        Throw New Exception("Bad index")
      Else
        Retrieve = current.data
      End If
    End Function
  End Class
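As an aside—this sketch isn't from the column—the parameterized-type route mentioned above expresses the same structure and behavior exactly once, with the element type left as a placeholder; here in C#, with names chosen to mirror Figure 1:

  using System;

  class ListOf<T>
  {
    class Node
    {
      public Node(T dt) { data = dt; }
      public T data;
      public Node nextNode;
    }

    Node head;

    public void Insert(T newParam)
    {
      if (head == null) { head = new Node(newParam); return; }
      Node current = head;
      while (current.nextNode != null)
        current = current.nextNode;
      current.nextNode = new Node(newParam);
    }

    public T Retrieve(int index)
    {
      Node current = head;
      int counter = 0;
      while (current != null && counter < index)
      {
        current = current.nextNode;
        counter++;
      }
      if (current == null)
        throw new Exception("Bad index");
      return current.data;
    }
  }

A ListOf<int> and a ListOf<string> then come from a single source file, which is the same commonality/variability trade-off the rest of the column examines from the code-generation side.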
Although not complicated, obviously, the hand-written approach in Figure 1 fails the Don't Repeat Yourself (DRY) test—every time the design calls for a new list of this kind, it will need to be written "by hand," which will clearly be a problem as time progresses, particularly if more features become necessary or desirable. And, to cite a basic flaw, it's still going to be awkward and time-consuming to write each of these.

But nobody ever said developers had to be the ones writing such code. Another program can easily do it, such as a program designed to kick out classes that are customized to each type needed. Which brings us around neatly to the solution of code generation, or as it's sometimes called, automatic metaprogramming.

Code Generation
Code generation—or automatic metaprogramming—is a technique that's been a part of programming for many years, ranging all the way from the C preprocessor macro through to the C# T4 engine. (Using macros to create "tiny templates" was common before C++ got templates.) In the Figure 2 example, we can swap in any type desired into the ListOf type. Note that the language being generated doesn't have to be the language in which the code generator is written—in fact, it will often help immensely if it isn't, because then it will be easier to keep the two more clearly distinct in the developer's head during debugging. Of course, once the class in question has been created, it needs only to be compiled and either added to the project or else compiled into its own assembly for reuse as a binary.
Figure 2 An Example of Automatic Metaprogramming

  Sub Main(ByVal args As String())
    Dim CRLF As String = Chr(13).ToString() + Chr(10).ToString()
    Dim template As String =
      "Class ListOf{0}" + CRLF +
      "  Class Node" + CRLF +
      "    Public Sub New(ByVal dt As {0})" + CRLF +
      "      data = dt" + CRLF +
      "    End Sub" + CRLF +
      "    Public data As {0}" + CRLF +
      "    Public nextNode As Node = Nothing" + CRLF +
      "  End Class" + CRLF +
      "  Private head As Node = Nothing" + CRLF +
      "  Public Sub Insert(ByVal newParam As {0})" + CRLF +
      "    If IsNothing(head) Then" + CRLF +
      "      head = New Node(newParam)" + CRLF +
      "    Else" + CRLF +
      "      Dim current As Node = head" + CRLF +
      "      While (Not IsNothing(current.nextNode))" + CRLF +
      "        current = current.nextNode" + CRLF +
      "      End While" + CRLF +
      "      current.nextNode = New Node(newParam)" + CRLF +
      "    End If" + CRLF +
      "  End Sub" + CRLF +
      "  Public Function Retrieve(ByVal index As Int32)" + CRLF +
      "    Dim current As Node = head" + CRLF +
      "    Dim counter = 0" + CRLF +
      "    While (Not IsNothing(current.nextNode) And counter < index)" + CRLF +
      "      current = current.nextNode" + CRLF +
      "      counter = counter + 1" + CRLF +
      "    End While" + CRLF +
      "    If (IsNothing(current)) Then" + CRLF +
      "      Throw New Exception()" + CRLF +
      "    Else" + CRLF +
      "      Retrieve = current.data" + CRLF +
      "    End If" + CRLF +
      "  End Function" + CRLF +
      "End Class"

    If args.Length = 0 Then
      Console.WriteLine("Usage: VBAuto <listType>")
      Console.WriteLine("  where <listType> is a fully-qualified CLR typename")
    Else
      Console.WriteLine("Producing ListOf" + args(0))
      Dim outType As System.Type =
        System.Reflection.Assembly.Load("mscorlib").GetType(args(0))
      Using out As New StreamWriter(
          New FileStream("ListOf" + outType.Name + ".vb", FileMode.Create))
        out.WriteLine(template, outType.Name)
      End Using
    End If
  End Sub

Commonality, Variability and Pros and Cons
In the commonality/variability analysis, automatic metaprogramming occupies an interesting place. In the Figure 2 example, it places structure and behavior (the outline of the class above) into commonality, allowing for variability along data/type lines—that of the type being stored in the generated class. Using a rich templating language, the code generation templates can do source-time decision making if the code template is sufficiently complex (and this isn't necessarily a good angle to pursue), and it's even possible to eliminate commonality altogether and vary everything (data, structure, behavior and so on). Doing so typically becomes unmanageable quite quickly, and in general should be avoided, lest the source code template grow out of control trying to be too flexible.

However, code generation carries with it some significant risks, particularly around areas of maintenance: Should a bug be discovered (such as the concurrency one in the ListOf… example in Figure 2), fixing it isn't a simple matter. The template can obviously be fixed, but that doesn't do anything for the code already generated—each of those source artifacts needs to be regenerated, and any handmade changes to those generated files will be intrinsically lost, unless the template-generated code has allowed for customizations. This risk of overwrite can be mitigated by use of partial classes, allowing developers to fill in the "other half" (or not) of the class being generated, and by extension methods, giving developers an opportunity to "add" methods to an existing family of types without having to edit the types. But partial classes must be in place from the beginning within the templates, and extension methods carry some restrictions that prevent them from replacing existing behavior—again leaving neither approach as a good mechanism to carry negative variability.
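The partial-class escape hatch just mentioned looks roughly like this—a sketch with invented file and member names, not code from the column—where the generator owns one half of the type and the developer owns the other:

  // ListOfInt32.generated.cs — written by the code generator; regenerated at will.
  public partial class ListOfInt32
  {
    // ... generated structure and behavior, as in Figure 1 ...
  }

  // ListOfInt32.custom.cs — written by hand; survives regeneration untouched.
  public partial class ListOfInt32
  {
    // A hand-authored addition the template knows nothing about.
    public override string ToString()
    {
      return "ListOfInt32";
    }
  }

Both halves compile into a single class, so regenerating the first file never disturbs the second—which is exactly the "other half" arrangement described above.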
Beyond the overwrite problem, its principal flaws are the lack of compiler structure and checking during the expansion process (unless, of course, that checking is done by the code generator itself, a task that's harder than it sounds) and the inability to capture negative variability in any meaningful way. That leads to one of the key realizations about automatic metaprogramming: Because it lacks any sort of inherent structural restrictions, the design has to choose the commonality and variability explicitly—given the ListOf example in Figure 2, the commonality is in the structure and behavior, and the generated classes vary by data type rather than by structural and behavioral lines—and this is something that's hard to track and ensure automatically.

Like so many other things in computer science, automatic metaprogramming remains one of the more widely used forms of metaprogramming, owing to the conceptual simplicity of the idea. In fact, arguably every Visual Studio project begins with automatic metaprogramming, in the form of the "project templates" and "item templates" that most of us use to create new projects or add files to projects. Other toolkits use automatic metaprogramming to provide "scaffolding" to ease the early stages of an application (such as what we see in ASP.NET MVC), and using metaprogramming as part of a larger framework or library is common, too, particularly for inter-process communication scenarios (such as the client and server stubs generated by Windows Communication Foundation). The .NET Framework offers some mechanisms to make code generation easier, such as the Text Template Transformation Toolkit (T4) that ships with Visual Studio—in many cases, those mechanisms were introduced to save other Microsoft developers some grief—but they won't eliminate all the potential pitfalls in code generation.

And yet, despite its obvious flaws and pitfalls, automatic metaprogramming remains a useful and handy tool to have in the designer toolbox. However, it's far from the only meta-tool in the programmer's toolbox, not by a long shot.

TED NEWARD is a principal with Neward & Associates, an independent firm specializing in enterprise .NET Framework and Java platform systems. He's written more than 100 articles, is a C# MVP and INETA speaker, and has authored and coauthored a dozen books, including "Professional F# 2.0" (Wrox, 2010). He also consults and mentors regularly. Reach him at ted@tedneward.com and read his blog at blogs.tedneward.com.
UI FRONTIERS        CHARLES PETZOLD

A Color Scroll for XNA

One of the first Windows programs I wrote for publication was called COLORSCR ("color scroll"), and it appeared in the May 1987 issue of Microsoft Systems Journal, the predecessor to this magazine. Over the years, I've frequently found it instructive to rewrite this program for other APIs and frameworks. Although the program is simple—you manipulate three scrollbars or sliders for red, green and blue values to create a custom color—it involves crucial tasks such as layout and event handling. In functionality, as well, the program is not strictly a simple, meaningless demo. If you ever found yourself creating a color-selection dialog, this program would be a good place to start.

In this article, I want to show you how to write a color-scroll program using the XNA Framework as implemented in Windows Phone 7. The result is shown in Figure 1. XNA is Microsoft .NET Framework-based managed code for games programming, but if you start searching through the Microsoft.Xna.Framework namespaces, you won't find any scrollbars or sliders or even buttons among the classes! XNA is simply not that kind of framework. If you want a slider in your program, it must be specially created for that job.

Showing you how to write an XNA color-scroll program isn't just an academic exercise, however. My intention is actually much broader. As you know, Windows Phone 7 supports both Silverlight and XNA, and generally for any particular application the choice between the two is obvious. However, you might find that the graphical demands of your application are simply too much for Silverlight, but at the same time you might be hesitant about using XNA instead because it's missing familiar amenities, such as buttons and sliders. My response is: Do not be afraid. Writing simple controls in XNA is easier than you probably think.

You can also consider this article as an introduction to XNA programming for Windows Presentation Foundation (WPF) and Silverlight programmers. I found myself applying certain concepts from WPF and Silverlight to this job, so the XNA color-scroll program is a celebration of synergy as well.

All the downloadable source code is in one Visual Studio solution called XnaColorScroll. This solution includes an XnaColorScroll project for Windows Phone 7 and a library called Petzold.Xna.Controls containing the four XNA "controls" that I created for this program. (Obviously, loading the solution into Visual Studio requires that you have the Windows Phone Development Tools installed.) Feel free to steal these controls and supplement them with your own for your own XNA controls library.

Figure 1 The XnaColorScroll Program

Games and Components
Just as the main class of a WPF, Silverlight or Windows Forms program is a class that derives from Window, Page or Form, the main class of an XNA program derives from Game. An XNA program uses this Game derivative to draw bitmaps, text and 3D graphics on the screen. To help you organize your program into discrete functional entities, XNA supports the concept of a component, which is perhaps the closest that XNA comes to a control. Components come in two varieties: GameComponent derivatives and DrawableGameComponent derivatives. DrawableGameComponent itself derives from GameComponent, and the names suggest the difference. I'll be using both base classes in this exercise. The Game class defines a property named Components of type GameComponentCollection.
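To make the Game/Components relationship concrete, here is a minimal sketch—not taken from the article's download; the class name and the property values are purely illustrative—of a Game derivative registering one of the components described next:

  using Microsoft.Xna.Framework;

  public class ColorScrollGame : Game
  {
    GraphicsDeviceManager graphics;

    public ColorScrollGame()
    {
      graphics = new GraphicsDeviceManager(this);
    }

    protected override void Initialize()
    {
      // Components added to this collection get the same Update/Draw calls the game gets.
      Components.Add(new TextBlock(this) { Text = "Red" });
      base.Initialize();
    }
  }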
Any components that a game creates must be added to this collection. Doing so ensures that the component gets the same calls to various overridden methods that the game itself gets. A DrawableGameComponent draws on top of whatever the Game draws.

Let's begin the analysis of the XnaColorScroll program—not with the Game derivative, but with the GameComponent derivatives. Figure 2 shows the source code for a component I call TextBlock, and it's used in much the same way as a WPF or Silverlight TextBlock. Notice the public properties following the constructor in Figure 2. These indicate the text itself, and the font, color and position of the text relative to the screen. Two additional properties, HorizontalAlignment and VerticalAlignment, allow that position to reference the side or center of the text string. In XnaColorScroll, these properties are handy for centering the text relative to the sliders. The class also includes a get-only Size property, which returns the size of the rendered text with that particular font. (The XNA Vector2 structure is often used for representing two-dimensional positions and sizes as well as vectors.)

The TextBlock component overrides both the Update and Draw methods, which is normal for a drawable component and normal for Game derivatives as well. These two methods constitute the program's "game loop," and they're called at the same rate as the video display—30 times per second for Windows Phone 7. You can think of the Draw method like a WM_PAINT message or an OnPaint override, except that it's called for every screen refresh. The Update method is called before each Draw call and is intended to be used for performing all the necessary calculations and preparation for the Draw method.

TextBlock as written has lots of room for performance improvements. For example, it could cache the value of the Size property or only recalculate the textOrigin field whenever any property that contributes to it changes. These enhancements might take priority if you ever discover that the program has become sluggish because all the Update and Draw calls in the program are taking longer than 0.03 seconds to execute.

(Code download available at code.msdn.microsoft.com/mag201101UIFrontiers.)
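The caching being hinted at might look something like the following sketch inside TextBlock—this isn't from the article; it simply replaces the automatic Text property with a backing field and invalidates a cached measurement whenever a contributing property changes (the Font setter would need the same treatment):

  string text;
  Vector2? cachedSize;   // null means "needs remeasuring"

  public string Text
  {
    get { return text; }
    set { text = value; cachedSize = null; }
  }

  public Vector2 Size
  {
    get
    {
      if (cachedSize == null)
        cachedSize = (String.IsNullOrEmpty(text) || this.Font == null) ?
          Vector2.Zero : this.Font.MeasureString(text);
      return cachedSize.Value;
    }
  }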
Figure 2 The TextBlock Game Component

  public class TextBlock : DrawableGameComponent
  {
    SpriteBatch spriteBatch;
    Vector2 textOrigin;

    public TextBlock(Game game) : base(game)
    {
      // Initialize properties
      this.Color = Color.Black;
    }

    public string Text { set; get; }
    public SpriteFont Font { set; get; }
    public Color Color { set; get; }
    public Vector2 Position { set; get; }
    public HorizontalAlignment HorizontalAlignment { set; get; }
    public VerticalAlignment VerticalAlignment { set; get; }

    public Vector2 Size
    {
      get
      {
        if (String.IsNullOrEmpty(this.Text) || this.Font == null)
          return Vector2.Zero;
        return this.Font.MeasureString(this.Text);
      }
    }

    protected override void LoadContent()
    {
      spriteBatch = new SpriteBatch(this.GraphicsDevice);
      base.LoadContent();
    }

    public override void Update(GameTime gameTime)
    {
      if (!String.IsNullOrEmpty(Text) && Font != null)
      {
        Vector2 textSize = this.Size;
        float xOrigin = (int)this.HorizontalAlignment * textSize.X / 2;
        float yOrigin = (int)this.VerticalAlignment * textSize.Y / 2;
        textOrigin = new Vector2((int)xOrigin, (int)yOrigin);
      }
      base.Update(gameTime);
    }

    public override void Draw(GameTime gameTime)
    {
      if (!String.IsNullOrEmpty(Text) && Font != null)
      {
        spriteBatch.Begin();
        spriteBatch.DrawString(this.Font, this.Text, this.Position, this.Color,
          0, textOrigin, 1, SpriteEffects.None, 0);
        spriteBatch.End();
      }
      base.Draw(gameTime);
    }
  }

Components and Child Components
As you can see in Figure 1, XnaColorScroll has six TextBlock components but also three sliders, which are probably going to be a little more challenging. Very many years ago, I might have tried coding the entire slider in a single class. With my experience in WPF and Silverlight control templates, however, I realize that the Slider control might best be constructed by assembling a Thumb control and RepeatButton controls. It seems reasonable to try to create an XNA Slider component from similar child components.

A Game class adds child components to itself by adding them to its Components property of type GameComponentCollection. The big question now becomes: Does GameComponent itself have a Components property for its own child components? Unfortunately, the answer is no. However, a component has access to its parent Game class, and if the component needs to instantiate its own child components, it can add those additional components to the same Components collection of the Game class. Do this in the right order and you won't even need to mess around with the DrawOrder property—defined by DrawableGameComponent—that can help with the order in which components are rendered. By default, the Draw method in the Game class is called first, then all the Draw methods in the components as they appear in the Components collection.

So let's begin. Like TextBlock, the Thumb class derives from DrawableGameComponent. The entire Thumb class except for the touch processing is shown in Figure 3. Thumb defines the same three events as the WPF and Silverlight Thumb controls. In the TextBlock component in Figure 2, the Position property is a point of type Vector2; in Thumb, the Position property is a Rectangle object, and defines not only the location of the Thumb, but also its rectangular size. The Thumb visuals consist entirely of a white 1 x 1 pixel bitmap that's displayed at the location and size indicated by the Position property.
(Using single-pixel bitmaps stretched to a particular size is one of my favorite XNA programming techniques.) The Thumb processes touch input but doesn't move itself. The Slider component is responsible for handling DragDelta events from the Thumb and setting a new Position property.

Figure 3 Most of the Thumb Class

  public class Thumb : DrawableGameComponent
  {
    SpriteBatch spriteBatch;
    Texture2D tinyTexture;
    int? touchId;

    public event EventHandler<DragEventArgs> DragStarted;
    public event EventHandler<DragEventArgs> DragDelta;
    public event EventHandler<DragEventArgs> DragCompleted;

    public Thumb(Game game) : base(game)
    {
      // Initialize properties
      this.Color = Color.White;
    }

    public Rectangle Position { set; get; }
    public Color Color { set; get; }

    protected override void LoadContent()
    {
      spriteBatch = new SpriteBatch(this.GraphicsDevice);
      tinyTexture = new Texture2D(this.GraphicsDevice, 1, 1);
      tinyTexture.SetData<Color>(new Color[] { Color.White });
      base.LoadContent();
    }

    ...

    public override void Draw(GameTime gameTime)
    {
      spriteBatch.Begin();
      spriteBatch.Draw(tinyTexture, this.Position, this.Color);
      spriteBatch.End();
      base.Draw(gameTime);
    }
  }
Besides the Thumb, my Slider also contains two RepeatButton components, one on each side of the Thumb. Like the WPF and Silverlight RepeatButton controls, these components generate a series of Click events if you hold your finger on them. In my XNA Slider, the RepeatButton components occupy a particular area of the screen but are actually invisible. (The thin vertical track down the center is drawn by Slider itself.) This means that RepeatButton can derive from GameComponent rather than DrawableGameComponent. Figure 4 shows the whole RepeatButton control except for the touch processing.

At this point, we're ready to build the Slider component. To keep it reasonably simple, I restricted myself to a vertical orientation. Slider has the usual public properties—Minimum, Maximum and Value, among others—and also defines a ValueChanged event. In its Initialize override, Slider creates the three child components (and saves them as fields), sets the event handlers and adds them to the Components collection of its parent Game class, accessible through its Game property:

  public override void Initialize()
  {
    thumb = new Thumb(this.Game);
    thumb.DragDelta += OnThumbDragDelta;
    this.Game.Components.Add(thumb);

    btnDecrease = new RepeatButton(this.Game);
    btnDecrease.Click += OnRepeatButtonClick;
    this.Game.Components.Add(btnDecrease);

    btnIncrease = new RepeatButton(this.Game);
    btnIncrease.Click += OnRepeatButtonClick;
    this.Game.Components.Add(btnIncrease);

    base.Initialize();
  }

XNA Game derivatives and GameComponent derivatives have three occasions to perform initialization—the class constructor, an override of the Initialize method and an override of the LoadContent method—and it might be a little confusing when to use which. As the name suggests, LoadContent is great for loading content (such as fonts or bitmaps) and should be used for all other initialization that depends on the existence of the GraphicsDevice object. I prefer creating components and adding them to the Components collection during the Initialize override, as shown in the previous code snippet. In XnaColorScroll, the Game derivative class also creates six TextBlock components and three Slider components in its Initialize override. When the Initialize method in the base class is called at the end, then all the Initialize methods for the child components are called. The Initialize method in each Slider then creates the Thumb and two RepeatButton components. This scheme ensures that all these components are added to the Components collection in a proper order for rendering.

Processing Touch Input
The primary means of user input in most Windows Phone 7 programs is multi-touch, and XNA is no exception. (Keyboard input is also available to XNA programs, and programs can also use the phone's accelerometer as an input device.) An XNA program obtains touch input during the Update override. There are no touch events. All touch input is polled and is accessible through the TouchPanel class. The static TouchPanel.GetState method provides low-level touch input that consists of TouchLocation objects indicating Pressed, Moved and Released activity with position information and integer ID numbers to distinguish between multiple fingers. The TouchPanel.ReadGesture method (which I don't use in this program) provides higher-level gesture support.

When I first started working with game components, I thought that each component could obtain touch input on its own, take what it needed and ignore the rest. This didn't seem to work very well; it was as if the game class and the components were all competing for touch input rather than sharing it like polite children. I decided to work out another system for processing touch input. Only the Game class calls TouchPanel.GetState. Then, each TouchLocation object is passed to child components through a method I call ProcessTouch. This allows the components to get first dibs on the input. A component that makes use of a TouchLocation object returns true from the ProcessTouch method, and further processing for that particular TouchLocation object ends. (Returning true from ProcessTouch is like setting the Handled property of RoutedEventArgs to true in WPF or Silverlight.)
Figure 4 Most of the RepeatButton Class

  public class RepeatButton : GameComponent
  {
    static readonly TimeSpan REPEAT_DELAY = TimeSpan.FromMilliseconds(500);
    static readonly TimeSpan REPEAT_TIME = TimeSpan.FromMilliseconds(100);

    int? touchId;
    TimeSpan clickTime;
    bool delayExceeded;

    public event EventHandler Click;

    public RepeatButton(Game game) : base(game)
    {
    }

    public Rectangle Position { set; get; }

    ...
  }

In XnaColorScroll, the Update method in the main Game class calls TouchPanel.GetState to get all the touch input and then calls the ProcessTouch method in each Slider component as shown here:

  TouchCollection touches = TouchPanel.GetState();

  foreach (TouchLocation touch in touches)
  {
    for (int primary = 0; primary < 3; primary++)
      if (sliders[primary].ProcessTouch(gameTime, touch))
        break;
  }

The ProcessTouch method in each Slider control then calls the ProcessTouch methods in the two RepeatButton components and the Thumb component:

  public bool ProcessTouch(GameTime gameTime, TouchLocation touch)
  {
    if (btnIncrease.ProcessTouch(gameTime, touch)) return true;
    if (btnDecrease.ProcessTouch(gameTime, touch)) return true;
    if (thumb.ProcessTouch(gameTime, touch)) return true;
    return false;
  }

Notice how further processing of each TouchLocation object stops whenever ProcessTouch returns true. If any of these components (or the Game class itself) wished to perform its own touch processing, it would first call ProcessTouch in each child. Any TouchLocation object that remained unprocessed by the children could then be examined for the class's own use. In effect, this scheme allows the visually topmost children to have first access to touch input, which is often exactly what you want.

Both the RepeatButton and the Thumb are interested in touch input if a finger is first touched within the area indicated by the component's Position rectangle. The component then "captures" that finger by saving the touch ID. All other touch input with that ID belongs to that particular component until the finger is released.
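Figure 3 deliberately omits the Thumb's touch processing, so the following is only a sketch of the capture logic just described—the ProcessTouch shape matches the Slider code above, but DragEventArgs is the article's own type whose members aren't shown here, so the event payloads are left empty:

  public bool ProcessTouch(GameTime gameTime, TouchLocation touch)
  {
    switch (touch.State)
    {
      case TouchLocationState.Pressed:
        // Capture the finger if it first lands inside the thumb's rectangle.
        if (touchId == null &&
            this.Position.Contains((int)touch.Position.X, (int)touch.Position.Y))
        {
          touchId = touch.Id;
          if (DragStarted != null)
            DragStarted(this, new DragEventArgs());
          return true;
        }
        break;

      case TouchLocationState.Moved:
        if (touch.Id == touchId)
        {
          if (DragDelta != null)
            DragDelta(this, new DragEventArgs());   // delta computation omitted
          return true;
        }
        break;

      case TouchLocationState.Released:
        if (touch.Id == touchId)
        {
          touchId = null;   // release the capture
          if (DragCompleted != null)
            DragCompleted(this, new DragEventArgs());
          return true;
        }
        break;
    }
    return false;
  }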
Propagating Events
The touch processing of the Thumb and RepeatButton components is really the crux of the Slider control. When the Thumb component detects finger movement on its surface, it generates DragDelta events; when the RepeatButton component detects taps or sustained presses, it fires Click events. Both these events are handled by the parent Slider component, which adjusts its Value property accordingly, and this fires a ValueChanged event from the Slider. The Slider is also responsible for setting the Position properties of the Thumb and two RepeatButton components based on the new Value.

The Game class handles the ValueChanged events from the three Slider components in a single event handler, so the method simply sets new values of the three TextBlock components and a new color:

  void OnSliderValueChanged(object sender, EventArgs args)
  {
    txtblkValues[0].Text = sliders[0].Value.ToString("X2");
    txtblkValues[1].Text = sliders[1].Value.ToString("X2");
    txtblkValues[2].Text = sliders[2].Value.ToString("X2");

    selectedColor = new Color(sliders[0].Value, sliders[1].Value, sliders[2].Value);
  }

That leaves the Draw override in the Game class with just two jobs: Draw the black rectangle that serves as background behind the components, and draw the rectangle with a new selectedColor on the right. Both jobs involve a 1 x 1 pixel bitmap stretched to fill half the screen.

Better than Silverlight?
I recently also wrote a Silverlight version of the color-scroll program for Windows Phone 7, and I discovered that the program only allowed me to manipulate one Slider at a time. However, if you deploy the XnaColorScroll program to a phone, you can manipulate all three Slider controls independently. I find that difference very interesting. It illustrates once again how high-level interfaces such as Silverlight may make our lives much simpler, but often something else needs to be sacrificed. I believe the concept is called TANSTAAFL: There ain't no such thing as a free lunch.

XNA is so low-level in some respects that it might remind us of the Wild West. But with a little judicious use of components (including components built from other components) and mimicking routed event handling, even XNA can be tamed and made to seem more like Silverlight—perhaps even better.

CHARLES PETZOLD is a longtime contributing editor to MSDN Magazine. His new book, "Programming Windows Phone 7" (Microsoft Press, 2010), is available as a free download at bit.ly/cpebookpdf.

THANKS to the following technical expert for reviewing this article: Shawn Hargreaves
DON'T GET ME STARTED        DAVID PLATT

Turn! Turn! Turn!

I was listening to Pandora.com the other day, and its algorithms decided to play me the Byrds classic (1965) folk-rock song "Turn! Turn! Turn!" I know you'll recognize some of its lyrics: "To everything, there is a season, and a time for every purpose under heaven. A time to be born, a time to die, a time to plant, a time to reap, a time to kill, a time to heal, a time to laugh, a time to weep."

The words, of course, come from the biblical book of Ecclesiastes, specifically the third chapter. Their authorship is popularly attributed to Shlomo ben David, Solomon the Wise (although see the debate at bit.ly/g8Q5I6). Pete Seeger put the words to music in 1959, rearranging some to make the song more singable ("poetic license," another triumph of usability over strict veracity) and adding six words of his own. If true, an ancient Israelite king would own credit for the lyrics to a Billboard No. 1 hit (though I suspect the copyright has expired, so he wouldn't get much money for it).

Geek that I am, I couldn't help but realize the pattern that unites the verses of this song. They all describe opposite actions and state that a time exists for each. To put it into canonical geek language, there is a time for A and a time for !A (or ~A, if you're going to get picky).

Andy Rooney extended this pattern in his book, "The Most of Andy Rooney" (Galahad Books, 1991), when he suggested changes to the list: "There is a time for putting chocolate sauce on vanilla ice cream, and a time for eating vanilla ice cream without chocolate sauce. … A time for exercise and a time for lying down and taking a little nap. … A time to play basketball, which shall be in the wintertime, and a time to knock it off with the basketball, which shall be as the wintertime comes to an end."

Rooney is one of my longtime influences. We both graduated from Colgate University, he in 1942 and I in 1979. I've sometimes described my goals for this column as: "Think of it as a cross between Andy Rooney, Dennis Miller and George Will."

I've blatantly plagiarized Sol's and Pete's and Andy's idea and made the following observations about the contrasting times in a geek's life. My friends, there is:

A time to slam down a huge jolt of caffeine, and a time to sip it slowly over the course of an hour. A time to work until two in the morning, and a time to go home and appreciate still having a family. A time for bullet points and a time to ditch PowerPoint and actually talk to your audience. A time to play Solitaire at meetings, and a time to pay attention to the speaker (rare, unless that speaker is me). A time to save state in object instances, and a time to make them stateless. A time to listen to Clippy ("I see you're writing a crude forgery") and a time to shoot him in the head. A time to chase the cat off your laptop keyboard, and a time to use your desktop and let her sleep. A time for bullet-proofing your code and a time for taking short cuts. A time to put your user first and not stroke your own ego. There isn't an opposite to this one. A time to hold up delivery until you can fix a particular feature, a time to chop out the entire feature because you can't fix it and need to make the ship date, and a time to just ship the damn thing. A time to read brilliant columns, and a time to quit goofing off and get back to work already. That's now.

I hope you'll use the e-mail link below to send me your own favorites. Or maybe it's better if you don't.

DAVID S. PLATT teaches Programming .NET at Harvard University Extension School and at companies all over the world. He's the author of 11 programming books, including "Why Software Sucks" (Addison-Wesley Professional, 2006) and "Introducing Microsoft .NET" (Microsoft Press, 2002). Microsoft named him a Software Legend in 2002. He wonders whether he should tape down two of his daughter's fingers so she learns how to count in octal. You can contact him at rollthunder.com.