June 20, 2014

Continuous Integration (CI) using TFS, Visual Studio and NuGet with Artifactory

   Part 2 - “Is the sky cracking?”


In the first part of this blog, we covered downloading the sample projects, installing Artifactory Pro, building the MyLogger library, and storing it in Artifactory. We also covered how to set up Artifactory and its ability to proxy build dependencies from a remote cloud repository.

This second part will cover the consumption of the MyLogger library by three projects found within one Visual Studio solution. This multi-example solution builds the api, core, and services libraries using the latest MyLogger.dll from the CI build, as well as other remote dependencies that will be cached by Artifactory. All build artifacts will be hosted in Artifactory. We will also cover how to promote a build artifact from an Artifactory staging repository to a release repository. The final part of this blog will show how to connect the artifacts stored in Artifactory back to Visual Studio and the TFS database.

All source code needed should already be available from Part 1 of this blog. Artifactory Pro setup with the appropriate repositories is assumed to be complete, and Visual Studio/TFS knowledge is assumed.

Creating the Multi-Example Build Definition

Continuing, we create a new multi-example build definition in Visual Studio to build the multi-example solution. The customizations and the relevant parts of this build definition are highlighted below, with a brief explanation of each customization below the image. This build will create the three libraries, package each into a NuGet package, and push these packages to the Artifactory server.



\NuGetExample\MultiProjectsExample\multi-example.sln is the solution that will build and store the api.dll, core.dll, and the services.dll.

/p:DistribPackages=true --- This property triggers a custom MSBuild target defined inside the NuGet.targets file. The DistribPackages target is meant to “push” the NuGet packages from the build site to the local Artifactory repository.

When NuGet package restore management is added to a solution in TFS, it adds a NuGet.exe, a NuGet.config, and a NuGet.targets file to the solution's .nuget folder. In this project we check in the NuGet.config and the NuGet.targets files, but not the NuGet.exe.  The NuGet.targets file is customized to contain the MSBuild targets needed to deploy to Artifactory.

\ProcessTemplates\Scripts\UpdateNuGetPackages.ps1 – This is a custom “pre-build” PowerShell script that is run prior to the compile. The main purpose of this script is to make sure that the latest MyLogger package and the other required binaries are included in the build.

$(BuildDefinitionName)_1.0.3$(Rev:.r) – This will allow the user to set the Major.Minor.Build number.

Build Multi-Example Project with MyLogger from Artifactory


The multi-example.sln solution contains three projects – one each for the api, the core, and the services. All three consume the MyLogger.dll as well as other third-party libraries. After the NuGet integration with the multi-example.sln file, we need to add Newtonsoft.Json, MyLogger, and Ninject to the installed packages. Although some of these come from external repositories, they will all be cached locally in Artifactory and accessed through the virtual 'repo' repository instead of having to go outside.

Below is the NuGet Package Manager screen of installed packages for this solution and a screenshot of the packages.config dialog, which can be updated for all or some of the projects within the solution.

 

During the build of the multi-example.sln, we run the UpdateNuGetPackages.ps1 script to make sure we have the latest packages needed for each of the three builds. There is a RestorePackages target in the NuGet.targets file, but it only restores the versions already listed in each project's packages.config file; it does not update them.
The packages.config is read at the beginning of the build and the version numbers are fixed at that point. If a new NuGet package has been released, MSBuild will not get the latest version because packages.config has not been updated yet. For this reason, we use a small pre-build PowerShell script to update the packages.config files before MSBuild loads them. This issue may be resolved in future releases of the build template.
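The actual script ships with the sample source from Part 1; a minimal sketch of the idea, assuming the TFS 2013 build environment variables and a NuGet.exe available on the build agent (the paths and package-source URL are illustrative), might look like this:

    # UpdateNuGetPackages.ps1 -- illustrative sketch only; the script in the sample
    # project may differ. Paths and the package-source URL are assumptions.
    param(
        # TF_BUILD_SOURCESDIRECTORY is set by the TFS 2013 build agent.
        [string]$SourcesDirectory = $env:TF_BUILD_SOURCESDIRECTORY,
        [string]$NuGetExe         = "C:\Tools\NuGet\NuGet.exe",
        [string]$PackageSource    = "http://tfs:8081/artifactory/api/nuget/staging"
    )

    # Update every packages.config in the workspace so that MSBuild reads the new
    # version numbers when it loads the projects.
    Get-ChildItem -Path $SourcesDirectory -Recurse -Filter packages.config | ForEach-Object {
        & $NuGetExe update $_.FullName `
            -Source $PackageSource `
            -RepositoryPath (Join-Path $SourcesDirectory "packages") `
            -NonInteractive
    }
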
After the multi-example build completes, all the artifacts from the three projects are packaged with NuGet and “pushed” into the nuget-staging-local repository in Artifactory. These newly stored artifacts can be pulled from Artifactory and deployed. Any promotion scheme used for the MyLogger project artifacts may also apply to these artifacts.

Promoting from Staging to Release

After a package has been deployed and tested, it can be approved for promotion to a “release” status. In Artifactory we use a release repository (e.g. nuget-release-local) to store NuGet packages that are ready for production release. Access control is maintained through the group and user security settings of each repository.

The move from the local staging repository to the local release repository is a “cheap copy” and does not take up time or disk space – “it happens only in the database”, much like placing a tag on a group of files.

Once the files are in the release repository, other developers who need to work with the latest “released” package can pull their dependencies from the nuget-release-local repository, ensuring they are working with a library that has been fully tested and is ready for production.

Below is a sample of the outcome of such a promotion action within Artifactory. Promotions can be made from the GUI or scripted. If desired, the scripts can be connected to a test-results or approval-signature workflow. By using promotions, one can prevent the DevOps group from deploying code from the staging CI build repository (code that has not been approved by QA) instead of from a release repository.
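
For reference, a scripted promotion can be as simple as a call to Artifactory's REST API to move the item between repositories. The sketch below is illustrative only; the package path and credentials are hypothetical.

    # Move a tested package from staging to release -- illustrative sketch.
    $artifactory = "http://tfs:8081/artifactory"
    $packagePath = "MyLogger.2.0.1.15.nupkg"   # hypothetical path within the repository

    # Artifactory's "move item" REST call performs the cheap, database-only move
    # from the staging repository to the release repository described above.
    Invoke-RestMethod -Method Post `
        -Uri "$artifactory/api/move/nuget-staging-local/$packagePath?to=/nuget-release-local/$packagePath" `
        -Credential (Get-Credential)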

Connecting from Artifactory back to TFS

After the NuGet packages are stored inside of Artifactory, we can return to the build information stored in TFS by doing the following:
   Go to Artifactory -> Artifacts -> NuPkg Info -> click on MyLogger under the General tab
From here, you can review the original build, read the diagnostic logs, review the tests that have been run, or review other build details inside the TFS database.

From Artifactory one has a direct link back to the TFS project and the build that actually pushed the NuGet package into Artifactory. Below one can see more of the details for this build.


If you wish to see the contents of the TFS drop site for the build behind a NuGet package stored in Artifactory, there is a URL link from the TFS Web Client directly into the drop site.

In addition to the checksums for the NuGet packages stored inside of Artifactory, other properties like the Project URL and the Release Notes URL can also be found.


This integration did not modify the build process templates, did not rely on custom activities, and did not require any custom C# code. The PowerShell script used is lightweight and may be removed or expanded as the user desires.

What is the Outcome of all this effort?

Artifactory Pro, with little customization, currently allows one to store artifacts (binaries) built using Visual Studio and TFS. Artifactory also allows developers to host remote build dependencies locally on an on-premises server controlled by the developers. Libraries built for consumption by other internal development groups can be hosted on the same Artifactory server. These artifacts can then be promoted through a CI pipeline and deployed from an Artifactory release-approved repository into production or to the customer.

For this example, we have modified the standard NuGet.targets file, added a simple PowerShell script, and customized the repositories in Artifactory. The same or similar results can be accomplished by using MSBuild tasks, TFS Build Workflow customizations, PowerShell scripts, Visual Studio plug-ins, or a combination of all of these.

The integration of all these tools will hopefully improve as the demand rises.



 


June 6, 2014

Continuous Integration using TFS, NuGet, and Artifactory

                    “The sky is cracking”

This blog shows how Artifactory, a binary repository manager, can be used a) as the storage location for remotely located build references, b) as a drop site for locally built CI artifacts, and c), in a future blog, as a store and distributor of all these binaries.


For this demo we will use the MyLogger solution. MyLogger.sln consumes third-party references during its build process and produces a library exposing a simple logging method. This library will later be consumed from Artifactory by a second solution in a follow-up blog. The remotely located third-party resources used by MyLogger will be cached in Artifactory.

Artifactory, combined with NuGet, allows TFS users to retrieve the latest binaries from their local CI builds while automatically updating referenced binaries coming from multiple remote repositories. Using Artifactory in this manner reduces the CI build downtime caused by limited internet access.

First, Let's Download the Source Code Example

The source can be found at https://artifactorynugetexample.codeplex.com/

You can clone, fork and clone, or use the 'download' tab to get a zip file of the code. After downloading the source code, create a TFS project (TFVC or GIT) using the Default Scrum Template in TFS 2013. Add the source to the TFS project.

Your source code tree should look like this:


WARNING: You should wait before opening the solution in Visual Studio. NuGet and Artifactory need to be installed first, or Visual Studio may attempt to restore the packages from other sources.

Second, Let's Set Up an Artifactory Pro Server

Standard Setup:

For integration with NuGet, the Pro version is needed. The Java SDK 1.7+ is a requirement for running Artifactory. The Artifactory Pro server may be located on one's local workstation, the TFS host, an unrelated machine, or even in the cloud. For our purposes here, the local TFS server will host the Artifactory Pro service.


  1. Download and install Java SDK 1.7+ (create an environment variable for $JAVA_HOME) http://www.oracle.com/technetwork/java/javase/downloads/jdk7-downloads-1880260.html
  2. Download and install an evaluation copy of Artifactory Pro. For this purpose we recommend running it as a Windows service; in the \bin folder use the artifactory-service.exe. https://www.jfrog.com/registration/evaluateaddons.html

Customizations:

After the server is installed with a valid temporary Pro license (the license will be emailed from JFrog), and before starting the Artifactory service, replace the following two files with the ones provided in the source code download from Codeplex. These two files create the repositories and set up the user accounts.
  • security.xml = $(Artifactory-Home)\etc\security.import.xml
  • artifactory.config.xml = $(Artifactory-Home)\etc\artifactory.config.import.xml
To manage TFS-related remote and locally produced binaries in Artifactory, we need some custom repositories (see below - these were created automatically when you replaced the artifactory.config.xml file above). Reference instructions can be found at: http://www.jfrog.com/confluence/display/RTF/NuGet+Repositories


nuget-staging-local --- This repository is the storage location for the locally built libraries prior to QA testing and approval. The latest libraries here can be used for integration into the latest development code for testing.


nuget-release-local --- Repository to store the 'release' versions of all packages. This repository is the final storage for locally built packages that have been QA tested and approved for consumption or production.


nuget-symbols-local --- Repository for storing the symbols from the CI build. This repository will be the location of the symbols that match the builds found in the nuget-staging-local repository. These symbols will be pulled and deployed for debugging purposes by the developers.


jcenter-cache --- Local cache of the remote repository that proxies the Bintray/JCenter gallery. This repository caches the libraries consumed in builds from jcenter.bintray.com. It provides a local source for these libraries and may be useful with TeamCity plugins and Artifactory remote search.

nuget-gallery-cache --- Local cache of the remote repository that proxies the NuGet gallery. This repository caches the libraries consumed in builds from nuget.org, providing a local source for these libraries.

Now, Create a Build Definition in Visual Studio

After the source code for the MyLogger solution has been downloaded and is in a TFS project, we need to create a build definition.  This build definition is based on the default process template and will use default settings with the exceptions highlighted and described below.

MyLogger Build Definition


\NuGetExample\MyLogger\MyLogger.sln is the solution that will build and store the MyLogger.dll


/p:DistribPackages=true --- This property triggers a custom MSBuild target defined inside the NuGet.targets file. The DistribPackages target is meant to "push" the NuGet packages from the build site to the local Artifactory repository.


When NuGet package restore management is added to a solution in TFS, it adds a NuGet.exe, a NuGet.config, and a NuGet.targets file to the solution's .nuget folder. In this project we check in the NuGet.config and the NuGet.targets files, but not the NuGet.exe.  The NuGet.targets file is customized to contain the MSBuild targets needed to deploy to Artifactory.


\\tfs\builds\symbols -- creates a drop site for the symbols that are generated for this build. In this example we do not capture all the generated symbols and test results.


$(BuildDefinitionName)_2.0.1$(Rev:.r) - This allows the user to set the Major.Minor.Build number. The revision number allows TFS to increment the revision used in the "get last revision" of MyLogger when building the multi-example project.

Time to Integrate NuGet with Visual Studio

After the build definition has been created, we need to integrate the 'NuGet Package Manager' with Visual Studio solutions. Install instructions follow below:
  • In Visual Studio, go to Tools --> NuGet Package Manager --> Package Manager Settings
  • General --> Package Restore - Turn ON "Automatically check for missing packages during build in Visual Studio"

Package Sources - Make sure only the local Artifactory server is being used as a package source. We want everything used in the build to be cached inside the local Artifactory server.  Changes made here will be reflected in %APPDATA%\NuGet\NuGet.config.
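
As a rough illustration (the server name and repository key are the ones used in this example; your values will differ), the resulting user-level NuGet.config might look something like this, with the default nuget.org source disabled:

     <?xml version="1.0" encoding="utf-8"?>
     <configuration>
        <packageSources>
           <add key="Artifactory" value="http://tfs:8081/artifactory/api/nuget/staging" />
        </packageSources>
        <disabledPackageSources>
           <add key="nuget.org" value="true" />
        </disabledPackageSources>
     </configuration>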




By integrating the NuGet package management add-in with a solution in Visual Studio, the following files are modified as described below and will need to be checked in. The project and solution files should also be checked in after the integration.

NuGet.config file

The Artifactory server path will be added. Make sure the Artifactory key is pointing to the correct location and only one exists.

     <packageSources>
        <add key="Artifactory" value="http://tfs:8081/artifactory/api/nuget/staging" />
     </packageSources>

NuGet.targets file

The NuGet.targets file is where most of the custom changes were made to integrate Artifactory, NuGet, and TFS. These commands and targets can be viewed in the NuGet.targets file, and a sketch of the key additions follows the list below.

  • New Command UpdateCommand
  • New Command DistribCommand
  • Set Build to Depend on DistribPackages
  • New Target UpdatePackages - updates the packages folder from the packages.config if needed.
  • New Target DistribPackages - runs the NuGet push command to Artifactory
  • New Target TFSEnvVarTest - gets the path to msbuild.exe on the build server
  • New Task TFSEnvVar
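
The actual implementation is in the sample project's NuGet.targets file; the following is only a rough sketch of the pattern, assuming the standard NuGet.targets properties (NuGetExePath, PackagesConfig, BuildDependsOn). The push credentials and the non-standard property names are placeholders.

       <PropertyGroup>
          <PackageSource>http://tfs:8081/artifactory/api/nuget/staging</PackageSource>
          <UpdateCommand>"$(NuGetExePath)" update "$(PackagesConfig)" -Source "$(PackageSource)" -NonInteractive</UpdateCommand>
          <DistribCommand>"$(NuGetExePath)" push "$(OutDir)*.nupkg" -Source "$(PackageSource)" -ApiKey %ARTIFACTORY_APIKEY%</DistribCommand>
       </PropertyGroup>

       <!-- Make Build depend on DistribPackages when /p:DistribPackages=true is passed -->
       <PropertyGroup Condition=" '$(DistribPackages)' == 'true' ">
          <BuildDependsOn>$(BuildDependsOn);DistribPackages</BuildDependsOn>
       </PropertyGroup>

       <Target Name="UpdatePackages">
          <Exec Command="$(UpdateCommand)" LogStandardErrorAsError="true" />
       </Target>

       <Target Name="DistribPackages" DependsOnTargets="BuildPackage">
          <Exec Command="$(DistribCommand)" LogStandardErrorAsError="true" />
       </Target>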

*.sln file

A .nuget solution folder (containing the NuGet.config, NuGet.exe, and NuGet.targets files) will be added to the solution file.

*.csproj files

During Visual Studio's GUI integration of NuGet with the solution, the NuGet.targets file will be imported and a new target will be created in each .csproj file found in the solution:


       <Import Project="$(SolutionDir)\.nuget\NuGet.targets" Condition="Exists('$(SolutionDir)\.nuget\NuGet.targets')" />

       <Target Name="EnsureNuGetPackageBuildImports" BeforeTargets="PrepareForBuild">
         ...
       </Target>

Next, Build MyLogger and Deploy to Artifactory

Verify that the third-party references are coming from Artifactory and are managed by NuGet.
To confirm that the NuGet packages are actually coming from Artifactory, view Artifactory's \logs\request.log. Since we are doing a clean build, the \packages folder starts out empty and should be populated only from Artifactory.




Execute the build definition. If the build definition has been set up correctly, after queuing we should see the build definition name followed by the Major, Minor, Build, and Revision numbers. After the build completes, we review the MSBuild log to verify that the version of the Newtonsoft.Json package was correctly updated during the build.



The TFS drop site, if it was set up in the build definition, will contain the logs, the libraries, and the NuGet packages. This drop site has been created for reference. All the files found here will also be found in the NuGet package stored in the Artifactory nuget-staging-local repository.


In the MSBuild log, find the RestorePackages section. References to NuGet packages, regardless of where they originate, are stored in each project's packages.config file, parallel to its .csproj file.
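
For example, the packages.config for the MyLogger project might look something like this (the version number and target framework are illustrative):

       <?xml version="1.0" encoding="utf-8"?>
       <packages>
         <package id="Newtonsoft.Json" version="6.0.3" targetFramework="net45" />
       </packages>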


Here we read the packages.config file of each project within the solution and execute a NuGet update command through MSBuild. If packages are missing or a newer version is available, an MSBuild task in the NuGet.targets file will download these files to the packages folder.


During the build of the multi-example project, we use a pre-build PowerShell script to execute MSBuild prior to the Try/Compile stage, so that the packages are updated before the code is compiled. Otherwise, the packages.config files would already be loaded and would not get updated prior to the build.


The BuildPackage section in the MSBuild log will show which NuGet packages were created during the build and where the references to the dependency packages come from.


The DistribPackages section in the MSBuild log file shows where the NuGet packages were "pushed" by the NuGet application.


The nuget-staging-local repository is the location where we 'drop' the contents of the build after they have been zipped up into a NuGet package. From this local repository we can reference specific versions to include in other builds, we can pull specific versions to be deployed to an environment for testing, and we can promote from here to another repository such as nuget-release-local for later use in production or to be consumed by other developer teams. These concepts will be demonstrated in a follow-up blog.


This concludes the first section of this blog.

A future continuation blog will cover the promotion of this package to a "release-local" repository, security and access, and the consumption of this package by other builds to produce new packages which will be delivered into a CI production pipeline.

May 30, 2014

JFrog joins the Cloud Foundry Foundation to help open source integration leap forward!

We are excited to join the Cloud Foundry Foundation as a Silver member!


Well, that was a natural move for the frogs. As true believers in OSS, integration, and the cloud as a platform, it is only natural for us to join the other members of the foundation and offer our value and experience to the community.

JFrog streamlines the software development process by introducing smart and agile techniques for managing binaries and software packages. We were founded around an open source product - Artifactory - which now has over 20K installations around the world. Artifactory's cloud-based offering serves and sponsors many OSS projects, and the service is offered free of charge to support open source users.

Projects including Pivotal's Spring and Grails have been using the Artifactory SaaS version for the last few years, and our newest cloud-based social project - Bintray - serves the OSS community by providing a free distribution platform for projects, including Pivotal's Spring and Groovy.

So yes, it was a natural move for the frogs, and we are looking forward to driving the open source community even further forward, collaborating around cloud-based binary management solutions on the Cloud Foundry platform.



"Driven by Open Source and committed to the community, JFrog has always seen itself as a key influencer in promoting platforms for managing, packaging and distributing artifacts," said Fred Simon, Co-founder and Chief Architect of JFrog. "Through Artifactory and Bintray, JFrog already serves the SpringSource, Grails and Groovy communities, and we see ourselves as even more committed as we join the Cloud Foundry Foundation."

April 9, 2014

Private npm Registry With Artifactory

The main reason for Node's explosive popularity is its thriving ecosystem. Likewise, it's well understood that the main reason for that ecosystem's growth is npm, Node's package manager. npmjs.org usage has skyrocketed, with statistics showing over 4 million packages downloaded a day and over 68,000 packages publicly available, and the numbers just keep going up. In fact, node.js and npm are now growing at twice the rate of any other software platform today.
Packages per day across popular platforms. (Source: www.modulecounts.com)


With great power, comes great responsibility

NPMJS.ORG NEVER GOES DOWN!

I wish that were true, but I'm afraid that, like any other big, growing system, you should expect growing pains.
This means that if your builds rely on npmjs.org, you are entering a world of pain.











The solution should be easy enough:
“The easiest way is to replicate the couch database, and use the same (or similar) design doc to implement the APIs.” (npmjs.org).

Personally, I wouldn’t call that easy, not to mention that it’s a waste of resources:

  • Why would you want to periodically replicate the entire CouchDB when you only need the packages your build uses? Those packages should be lazy-cached on demand!
  • You now need someone to administer this CouchDB instead of using an out-of-the-box solution. 
  • What about aggregating multiple registries? You’re out of luck there since npm doesn’t currently support multiple registries.
  • What about the security model? You should be able to control who has access to what, and the current security model doesn’t allow you to do that. 



Meet Artifactory, with npm support!

So to answer the needs detailed above, here is what Artifactory can offer:

The basic stuff:

  • Remote repositories to proxy remote npm registries - The most important one would be the npmjs.org registry, but this can be applied to any compatible npm registry. Provides lazy on-demand caching for packages and metadata. 
  • Local repositories to store private npm packages - Easily store and share private npm packages using what we call “Local Repositories”. These packages can be shared easily and safely among internal teams that need them. 

But that’s not all. There’s much more to it when using a smart binary repository manager:

  • Virtual Repositories - No need for the npm client to support multiple registries. Simply define a virtual repository which aggregates the local repositories that contain your in-house packages and the remote repositories that proxy npmjs.org or any other compatible npm registry (see the example after this list).
  • Authentication and authorization - An enhanced security model which gives you full control over who can download or publish what to where.
  • Searches (including npm search) - Use the inherent npm search command, or utilize Artifactory’s powerful search capabilities such as searching by property or checksum and more.
  • Powerful custom user plugins platform - Enormous flexibility to customize how you work with npm packages. The sky’s the limit. 
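
For example, once a virtual npm repository is defined in Artifactory (the repository key npm-virtual below is illustrative, as is the server address), pointing the npm client at it is a one-liner:

    # Point npm at the Artifactory virtual repository (names are illustrative).
    npm config set registry http://localhost:8081/artifactory/api/npm/npm-virtual

    # From now on, installs resolve private packages from the local repositories and
    # lazily cache public ones from npmjs.org through the remote repository.
    npm install express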

So, are you ready to start using Artifactory with npm support? The full documentation is available in our user guide.

You can register to download and install your free 30-day evaluation locally, or use the cloud version with Artifactory Online.

April 3, 2014

Power to the People - Customize and Extend Artifactory with User Plugins

From our experience with thousands of Artifactory users, we know one thing for sure: we don't know better. Every organization does its ALM differently: artifact approval flow, snapshot retention policies, build-to-release flow, governance, required metadata and much, much more - each organization is different. We definitely have some ideas on how the build and deploy process should look, but there are so many things that make your process unique. And that's good. After all, you aren't paid for working within the ideal deployment cycle, but rather for solving a business problem. At least we hope so.

Acknowledging the fact that we don't know better complicates our lives as creators of a binary repository... and not only by hurting our ego. We want to give you the perfect tool for the job, but how can we do it without dictating to you what your job is? The solution is well known - extensions, a.k.a. add-ons, user plugins, you name it.

"OMG!", you might say. "Code! Joy-joy! Finally, an excuse to hack around!" Or "OMG! Code! It's your job to code those things into your product, not mine!" Look, either way, we don't have much choice, do we? When it comes to customization, you have to tell Artifactory what you want it to do. We can only do our best to make it simple for you. So, we developed a simple DSL.

In this post, I'll show you how easy it is to customize Artifactory with user plugins. Here's the story: you want to prevent the download of deprecated artifacts. The deprecation information is attached as a set of  custom properties to the artifacts by some quality-assurance mechanism (or organism).

Let's say, for example, the artifacts to be banned from download are annotated with property deprecated=true. Artifactory allows you to code callbacks that will be executed in response to various events in the system. You can find the list of available callbacks in the User Plugins documentation. So, we are going to write a download plugin and the callback we are looking for is the altResponse. In this callback, we can provide an alternative response instead of the one Artifactory was asked for. Here's the code:

 1 download {
 2     altResponse { request, responseRepoPath ->
 3         def deprecated = repositories.getProperties(responseRepoPath).getFirst('deprecated')
 4         if (deprecated && deprecated.toBoolean()) {
 5             status = 403
 6             message = 'This artifact was deprecated, please use some alternative.'
 7             log.warn "Request was made for deprecated artifact: $responseRepoPath.";
 8         }
 9     }
10 }

10 lines of code. That’s all. Let's examine them. First thing to notice: it's Groovy! If you are into it, good for you, enjoy! If you aren't, don't worry. It's almost like Java, so you'll read it without problems and will be productive from day 0.
So, here we go, line by line:
  1. Declares that it's a download plugin.
  2. Defines the callback type we want (altResponse). When we implement the alternative response, Artifactory provides us with 2 objects:
    • The request, an instance of org.artifactory.request.Request. It encapsulates the information about the incoming request, such as client details and the information requested
    • And responseRepoPath, an instance of org.artifactory.repo.RepoPath. It encapsulates the information about the artifact to be returned.
  3. We want the first value of the 'deprecated' property, if defined on the artifact represented by responseRepoPath.
  4. If the value exists and it is 'true', 1 or 'y' (as declared by Groovy's toBoolean())
  5. set return code to 403 (Forbidden) and
  6. set the correct error message and
  7. optionally, issue a warning to the Artifactory log.
Well, that's all. Now you can see that the dragon of user plugins isn't so scary. Just think about the unique ways you can automate your delivery cycle, apply regulations and checks, or provide your corporate users with a better Artifactory experience. Here are some samples and community-contributed plugins to ignite your imagination.
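
To try this one out, save the script as a .groovy file under ${ARTIFACTORY_HOME}/etc/plugins; see the User Plugins documentation for the details of how plugins are loaded and reloaded.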

Enjoy your build!


This is an updated repost of an old and forgotten post, now featuring the latest plugins API!

December 5, 2013

Introducing First Class RubyGems Support in Artifactory

Here's a short and down-to-business screen-cast that shows how to set up a feature-rich hosted Ruby Gems repository. You'll get the full monty - local repositories for sharing your private gems, remote repositories to stop being dependent on rubygems.org and a virtual repository that unifies and simplifies configuration. Of course, it plays awesomely with Jenkins, (by using Jenkins Artifactory Plugin) including the release management functionality.

Make yourself a cup of coffee and spend 6 minutes to get a clue on how powerful Ruby binary management can be.


Your comments and thoughts are welcome, both here and on YouTube. Still not sure why you need it? Read more.

August 22, 2013

Taking Control of App Releases

Featuring the report "Release Management for Enterprises" by RebelLabs

Release Management: More Relevant Than Ever

Today’s software users have rapidly evolving needs, are mobile, and expect 24/7 connectivity and reliability. So dev teams need to churn out new features and versions frequently to keep up while still making sure that service is not interrupted. Sounds like a tall order, but fail to do so, and users switch to competitors or other alternatives.

The dev teams have a multitude of collaboration tools, Kanban boards, build tools and agile practices, and can build features quickly. But when it comes to releasing these changes, the process is more manual, ad hoc, and slow. This is why it is important to take a close look at our release processes, streamline them, and take software to users quickly and safely.

Step 1: Tear Down Walls and Collaborate

The primary issue around streamlining release processes in an enterprise is, you guessed it, cultural differences and team fragmentation. The complexity of software projects and the need for specialists have greatly contributed to creating silos. The dev teams focus on developing new features while the ops teams ensure service reliability. Their goals are opposed to each other, driving a wedge between them and further breaking down lines of communication.

Collaboration is vital. Tear down walls and work together on:
  • Critiquing your processes. Look for dependencies and bottlenecks. Address them.
  • Exploring tools for automation. Automate repetitive and error-prone processes.
  • Driving cultural change. Communicate successes and celebrate them.
Look for small opportunities that can bring big improvements. This is the kind of awesome stuff DevOps is made up of :).

Step 2: The Fun Part - Automate!

That's right. Once you have optimized release processes, torn down the fences and felt good doing it, look for opportunities to automate.

Check out this 32-page report by RebelLabs that shows you how you can create an automated release pipeline from scratch using:

  • GitHub: version control system (VCS) where devs check in code
  • Bamboo: CI tool that pulls changes from the VCS to generate builds
  • Arquillian: to run integrated tests on newly generated builds 
  • Selenium: to run acceptance tests on newly generated builds
  • Artifactory: repository to store release artifacts, and build and test results
  • LiveRebel: release automation tool to deploy app updates with zero downtime
The image below is a snapshot of how these freely available tools work together.







Try it out with a new project and practice continuous delivery with all the checks and balances that come with proven release management practices.

Your Next Step

Get the report, read the discussion around release management issues and solutions, and then take control by building your own continuous delivery pipeline!