Text File Log Provider in SSIS


Note: If the state of the Extract Sample Currency Data check box appears dimmed instead of selected, the task uses the log settings of the parent container and you cannot enable the log events that are specific to the task. The default file name extension for this provider is .log.

The Windows Event log provider writes entries to the Application log in the Windows Event log on the local computer. The following table lists the ProgID and ClassID for the log providers that Integration Services includes, and the location of the logs to which the log providers write.

You can also create custom log providers. For more information, see Creating a Custom Log Provider. The log providers in a package are members of the log providers collection of the package.

You configure a log provider by providing a name and description for the log provider and specifying the connection manager that the log provider uses. The Windows Event log provider does not use a connection manager, because it writes directly to the Windows Event log. To customize the logging of an event or custom message, Integration Services provides a schema of commonly logged information to include in log entries. The Integration Services log schema defines the information that you can log.

You can select elements from the log schema for each log entry. A package and its containers and tasks do not have to log the same information, and tasks within the same package or container can log different information.

For example, a package can log operator information when the package starts, one task can log the source of the task's failure, and another task can log information when errors occur.

If a package and its containers and tasks use multiple logs, the same information is written to all the logs. You can select a level of logging that suits your needs by specifying the events to log and the information to log for each event.

You may find that some events provide more useful information than others. For example, you might want to log only the computer and operator names for the PreExecute event but all available information for the Error event.

To prevent log files from using large amounts of disk space, or to avoid excessive logging, which could degrade performance, you can limit logging by selecting specific events and information items to log. For example, you can configure a log to capture only the date and the computer name for each error. The following table describes three additional elements in the log schema that are not available on the Details tab of the Configure SSIS Logs dialog box.

Integration Services supports log entries on predefined events and provides custom log entries for many Integration Services objects. The following table describes the predefined events that can be enabled to write log entries when run-time events occur. These log entries apply to executables, the package, and the tasks and containers that the package includes. The name of the log entry is the same as the name of the run-time event that was raised and caused the log entry to be written.

The package and many tasks have custom log entries that can be enabled for logging. For example, the Send Mail task provides the SendMailTaskBegin custom log entry, which logs information when the Send Mail task starts to run, but before the task sends an e-mail message. For more information, see Custom Messages for Logging. Log data includes the name and the GUID of the package to which the log entries belong.

If you create a new package by copying an existing package, the name and the GUID of the existing package are also copied. As a result, you may have two packages that have the same GUID and name, making it difficult to differentiate between the packages in the log data.

To eliminate this ambiguity, you should update the name and the GUID of the new packages. You can also change the GUID and the name programmatically, or by using the dtutil command prompt. For more information, see Set Package Properties and dtutil Utility. Frequently, the logging options of tasks and For Loop, Foreach Loop, and Sequence containers match those of the package or a parent container. In that case, you can configure them to inherit their logging options from their parent container.
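The rename-and-regenerate step can also be scripted. Below is a minimal Python sketch that rewrites the package-level identifiers in the .dtsx XML directly; it assumes the package GUID and name live in the DTS:DTSID and DTS:ObjectName attributes of the root DTS:Executable element, which is how current .dtsx files are laid out. In practice, prefer dtutil or the designer for this task.

```python
# Sketch: give a copied package a new name and GUID by rewriting the
# package-level DTS:DTSID and DTS:ObjectName attributes of a .dtsx file.
# Assumes these attributes live on the root DTS:Executable element;
# for production use, prefer dtutil or the SSIS designer instead.
import uuid
import xml.etree.ElementTree as ET

DTS_NS = "www.microsoft.com/SqlServer/Dts"

def regenerate_package_identity(dtsx_in: str, dtsx_out: str, new_name: str) -> str:
    """Write a copy of the package with a fresh GUID and name; return the GUID."""
    ET.register_namespace("DTS", DTS_NS)
    tree = ET.parse(dtsx_in)
    root = tree.getroot()  # the DTS:Executable element for the package
    new_guid = "{" + str(uuid.uuid4()).upper() + "}"
    root.set(f"{{{DTS_NS}}}DTSID", new_guid)
    root.set(f"{{{DTS_NS}}}ObjectName", new_name)
    tree.write(dtsx_out, xml_declaration=True, encoding="utf-8")
    return new_guid
```

Because the GUID is generated with uuid4, two copies of the same source package end up with distinct identifiers in the log data.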

To use the parent logging options, you set the LoggingMode property of the container to UseParentSetting. In the Configure SSIS Logs dialog box, you can also create and save frequently used logging configurations as templates, and then use the templates in multiple packages. This makes it easy to apply a consistent logging strategy across multiple packages and to modify log settings on packages by updating and then applying the templates.

The templates are stored in XML files.

To enable logging in a package:

1. Enable the package and its tasks for logging. Logging can occur at the package, the container, and the task level, and you can specify different logs for packages, containers, and tasks.
2. Select a log provider and add a log for the package. Logs can be created only at the package level, and a task or container must use one of the logs created for the package.
3. Select the events and the log schema information about each event you want to capture in the log.

For information about programmatically setting these properties, see the documentation for the LogProvider class. The Data Flow task provides many custom log entries that can be used to monitor and adjust performance. For example, you can monitor components that might cause memory leaks, or keep track of how long it takes to run a particular component. For a list of these custom log entries and sample logging output, see Data Flow Task.

When you configure an error output in the data flow, by default the error output provides only the numeric identifier of the column in which the error occurred. For more info, see Error Handling in Data.

You can find column names by enabling logging and selecting the DiagnosticEx event. This event writes a data flow lineage map to the log.
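The id-to-name lookup over that map is mechanical. In the Python sketch below, the lineage-map fragment is invented for illustration only; the exact element names in the real DiagnosticEx output may differ, so the parser deliberately matches any element that carries both an id and a name attribute.

```python
# Sketch: look up a data-flow column name from its numeric identifier in a
# lineage map logged by the DiagnosticEx event. The XML below is an
# illustrative shape only -- the real element names may differ, so the
# parser matches any element carrying both "id" and "name" attributes.
import xml.etree.ElementTree as ET

def column_names_by_id(lineage_xml: str) -> dict:
    """Build a {column id -> column name} map from lineage-map XML."""
    mapping = {}
    for elem in ET.fromstring(lineage_xml).iter():
        col_id, col_name = elem.get("id"), elem.get("name")
        if col_id is not None and col_name is not None:
            mapping[int(col_id)] = col_name
    return mapping

# Hypothetical lineage-map fragment for demonstration:
SAMPLE = """
<Pipeline>
  <Path>
    <Column id="11" name="CurrencyKey"/>
    <Column id="17" name="AverageRate"/>
  </Path>
</Pipeline>
"""

lookup = column_names_by_id(SAMPLE)
print(lookup[17])  # the column an error output reported only as id 17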

You can then look up the column name from its identifier in this lineage map. Note that, to reduce the size of the log, the DiagnosticEx event does not preserve whitespace in its XML output. Perhaps the most useful custom log entry is the PipelineComponentTime event. This log entry reports the number of milliseconds that each component in the data flow spends on each of the five major processing steps.

The following table describes these processing steps. Integration Services developers will recognize these steps as the principal methods of a PipelineComponent. When you enable the PipelineComponentTime event, Integration Services logs one message for each processing step performed by each component.

The following log entries show a subset of the messages that the Integration Services CalculatedColumns package sample logs. These entries show that the data flow task spent the most time on the following steps, shown here in descending order. The Aggregate transformation named "Sum Quantity and LineItemTotalCost" spent the bulk of its time in PrimeOutput, plus 79 milliseconds in ProcessInput, performing calculations and passing the data to the next transformation.
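Totaling such messages per component and step is a simple parsing exercise. The Python sketch below assumes messages of the form shown in the sample output, The component "name" (id) spent n milliseconds in step; verify the format against your own logs before relying on it.

```python
# Sketch: total up PipelineComponentTime messages per component and step.
# Assumes messages of the form:
#   The component "<name>" (<id>) spent <n> milliseconds in <step>.
import re
from collections import defaultdict

MSG_RE = re.compile(
    r'The component "(?P<name>[^"]+)" \((?P<id>\d+)\) '
    r'spent (?P<ms>\d+) milliseconds in (?P<step>\w+)'
)

def total_times(messages):
    """Return {(component, step): total milliseconds} across all messages."""
    totals = defaultdict(int)
    for msg in messages:
        m = MSG_RE.search(msg)
        if m:
            totals[(m.group("name"), m.group("step"))] += int(m.group("ms"))
    return dict(totals)

# Invented sample messages for demonstration:
logs = [
    'The component "Aggregate" (1) spent 79 milliseconds in ProcessInput.',
    'The component "Aggregate" (1) spent 4 milliseconds in PrimeOutput.',
    'The component "Aggregate" (1) spent 2 milliseconds in PrimeOutput.',
]
print(total_times(logs)[("Aggregate", "PrimeOutput")])  # 6
```

Sorting the resulting totals in descending order reproduces the kind of ranking described above.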

This procedure describes how to add logs to a package, configure package-level logging, and save the logging configuration to an XML file. You can add logs only at the package level, but the package does not have to perform logging to enable logging in the containers that the package includes.

By default, the containers in the package use the same logging configuration as their parent container. For information about setting logging options for individual containers, see Configure Logging by Using a Saved Configuration File. Select a log provider in the Provider type list, and then click Add. Depending on the selected provider, use one of the following connection managers:

For Text files, use a File connection manager. For more information, see File Connection Manager. Optionally, select the package-level check box, select the logs to use for package-level logging, and then click the Details tab.

On the Details tab, select Events to log all log entries, or clear Events to select individual events. On the Details tab, click Save. The Save As dialog box appears. Locate the folder in which to save the logging configuration, type a file name for the new log configuration, and then click Save. To save the updated package, click Save Selected Items on the File menu.

Configure the Options in the Containers Pane. Configure the Options on the Providers and Logs Tab.

Containers: Select the check boxes in the hierarchical view to enable the package and its containers for logging. If a check box appears dimmed, the container uses the logging options of its parent.

This option is not available for the package. If a container is dimmed and you want to set logging options on the container, click its check box twice. The first click clears the check box, and the second click selects it, enabling you to choose the log providers to use and select the information to log.

Next, you need to define what information should be sent to the defined log providers. To retrieve the logs programmatically, you use the Executions and Operations properties of the Catalog class. We will see an example of that in this section.

To demonstrate how you would do this using the UI, open the MomDemo project and make sure the package is open. Log providers are selected in the Provider Type combo box and are defined as follows:

Text file provider: Requires you to configure a File Connection object that defines the location of the file. Text files are portable, and the CSV format is a simple-to-use, industry-wide standard.

SQL Profiler provider: The file must be specified with a .trc file extension so that you can open it using the SQL Server Profiler diagnostic tool. Using Profiler, you can view the execution of the package step by step, even replaying the steps in a test environment.

SQL Server provider: The first time a package that uses this provider is executed, a table called sysssislog is created automatically. Storing log information in a SQL Server database inherits the benefits of persisting information in a relational database system; you can easily retrieve log information for analysis across multiple package executions.

Windows Event Log provider: Logging package execution to the Windows Event Log is possibly the easiest way to store log events, and no additional configuration is required. The Windows Event Log is easy to view and can be viewed remotely if required.

XML file provider: The file is specified through a File Connection object. Make sure you save the file with an .xml file extension. XML files are very portable across systems and can be validated against a schema definition.
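The cross-execution analysis that a relational log store enables can be sketched with a simple query. In the example below, an in-memory SQLite database stands in for SQL Server, and the columns shown follow the documented sysssislog schema in abbreviated form; verify the table and column names against your target server.

```python
# Sketch: the kind of cross-execution analysis a relational log store makes
# easy. SQLite stands in for SQL Server here; the columns are an abbreviated
# subset of the sysssislog schema -- verify names against your server.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE sysssislog (
        event TEXT, computer TEXT, operator TEXT, source TEXT,
        executionid TEXT, starttime TEXT, message TEXT
    )
""")
# Invented sample rows spanning two package executions:
conn.executemany(
    "INSERT INTO sysssislog VALUES (?, ?, ?, ?, ?, ?, ?)",
    [
        ("PackageStart", "SRV1", "dom\\etl", "MyPackage", "exec-1", "2024-01-01", ""),
        ("OnError", "SRV1", "dom\\etl", "Load Task", "exec-1", "2024-01-01", "timeout"),
        ("OnError", "SRV1", "dom\\etl", "Load Task", "exec-2", "2024-01-02", "timeout"),
    ],
)

# Error counts per task across all logged executions:
rows = conn.execute("""
    SELECT source, COUNT(*) AS errors
    FROM sysssislog
    WHERE event = 'OnError'
    GROUP BY source
""").fetchall()
print(rows)  # [('Load Task', 2)]
```

The same GROUP BY pattern extends naturally to per-execution durations or error trends over time, which is exactly what the flat-file providers make awkward.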


