Public/New-ChatGPTConversation.ps1

function New-ChatGPTConversation {

    <#
    .SYNOPSIS
        Create a new ChatGPT conversation, or get a chat completion result directly if you specify the prompt parameter.
    .DESCRIPTION
        Create a new ChatGPT conversation. You can chat with the OpenAI service just like chatting with a human. You can also get the chat completion result directly if you specify the prompt parameter.
    .PARAMETER api_key
        The API key to access the OpenAI service. If not specified, the API key is read from the environment variable OPENAI_API_KEY. If you use the Azure OpenAI service, you can specify the API key through the environment variable OPENAI_API_KEY_AZURE or OPENAI_API_KEY_AZURE_<environment>, where <environment> can be any name you want, for example OPENAI_API_KEY_AZURE_DEV, OPENAI_API_KEY_AZURE_PROD, OPENAI_API_KEY_AZURE_TEST, etc.
    .PARAMETER model
        The model to use for this request. You can also set it in the environment variable OPENAI_CHAT_MODEL, or OPENAI_CHAT_DEPLOYMENT_AZURE if you use the Azure OpenAI service. If you use multiple environments, you can use OPENAI_CHAT_DEPLOYMENT_AZURE_<environment> to define the model for each environment. You can use engine or deployment as an alias of this parameter.
    .PARAMETER endpoint
        The endpoint to use for this request. You can also set it in the environment variable OPENAI_ENDPOINT, or OPENAI_ENDPOINT_AZURE if you use the Azure OpenAI service. If you use multiple environments, you can use OPENAI_ENDPOINT_AZURE_<environment> to define the endpoint for each environment.
    .PARAMETER azure
        Use this switch if you use the Azure OpenAI service.
    .PARAMETER system
        The system prompt. This is a string you can use to define the role you want the model to play, for example, "You are a chatbot, please answer the user's question according to the user's language."
        If you provide a file path to this parameter, the file is read as the system prompt.
        You can also specify a URL; its content is read as the system prompt.
        You can read the prompt from a library (https://github.com/code365opensource/promptlibrary) by using "lib:xxxxx" as the prompt, for example, "lib:fitness".
    .PARAMETER prompt
        If you want to get the result immediately, use this parameter to define the prompt. In that case the interactive chat conversation is not started.
        If you provide a file path to this parameter, the file is read as the prompt.
        You can also specify a URL; its content is read as the prompt.
        You can read the prompt from a library (https://github.com/code365opensource/promptlibrary) by using "lib:xxxxx" as the prompt, for example, "lib:fitness".
    .PARAMETER config
        The dynamic settings for the API call, which can cover the specific requirements of each model. Pass a custom object to this parameter, like @{temperature=1;max_tokens=1024}.
    .PARAMETER environment
        The environment name. If you use the Azure OpenAI service, you can use this parameter to select an environment; it is used to read the API key, model and endpoint from the corresponding environment variables. If the environment does not exist, the default environment is used.
        You can use env as an alias of this parameter.
    .PARAMETER api_version
        The API version. If you use the Azure OpenAI service, you can use this parameter to define the API version; the default value is 2023-09-01-preview.
    .PARAMETER outFile
        If you want to save the result to a file, use this parameter to set the file path. You can also use "out" as an alias.
    .PARAMETER local
        If you want to use local LLMs, like a model hosted by ollama, you can use this switch. You can also use "ollama" as an alias.
    .PARAMETER context
        If you want to pass some dynamic values to the prompt, you can use the context parameter. It can be anything; just specify a custom PowerShell object. You define the variables in the system prompt or user prompt by using the {{your_variable_name}} syntax, and then pass the data through the context parameter, like @{your_variable_name="your value"}. If there are multiple variables, you can use @{variable1="value1";variable2="value2"}. See the examples below.
    .PARAMETER json
        Ask the model to return the response in JSON format.
    .EXAMPLE
        New-ChatGPTConversation
        Create a new ChatGPT conversation, use OpenAI service with all the default settings.
    .EXAMPLE
        New-ChatGPTConversation -azure
        Create a new ChatGPT conversation, use Azure OpenAI service with all the default settings.
    .EXAMPLE
        chat -azure
        Create a new ChatGPT conversation by cmdlet's alias(chat), use Azure OpenAI service with all the default settings.
    .EXAMPLE
        New-ChatGPTConversation -api_key "your API key" -model "your model name"
        Create a new ChatGPT conversation, use OpenAI service with your API key and model name.
    .EXAMPLE
        New-ChatGPTConversation -api_key "your API key" -model "your deployment name" -azure
        Create a new ChatGPT conversation, use Azure OpenAI service with your API key and deployment name.
    .EXAMPLE
        New-ChatGPTConversation -api_key "your API key" -model "your deployment name" -azure -system "You are a chatbot, please answer the user's question according to the user's language."
        Create a new ChatGPT conversation, use Azure OpenAI service with your API key and deployment name, and define the system prompt.
    .EXAMPLE
        New-ChatGPTConversation -api_key "your API key" -model "your deployment name" -azure -system "You are a chatbot, please answer the user's question according to the user's language." -endpoint "https://api.openai.com/v1/completions"
        Create a new ChatGPT conversation, use Azure OpenAI service with your API key and deployment name, and define the system prompt and endpoint.
    .EXAMPLE
        chat -azure -system "You are a chatbot, please answer the user's question according to the user's language." -env "sweden"
        Create a new ChatGPT conversation by cmdlet's alias(chat), use Azure OpenAI service with the API key, model and endpoint defined in environment variable OPENAI_API_KEY_AZURE_SWEDEN, OPENAI_CHAT_DEPLOYMENT_AZURE_SWEDEN and OPENAI_ENDPOINT_AZURE_SWEDEN.
    .EXAMPLE
        chat -azure -api_version "2021-09-01-preview"
        Create a new ChatGPT conversation by cmdlet's alias(chat), use Azure OpenAI service with the api version 2021-09-01-preview.
    .EXAMPLE
        gpt -azure -prompt "why people smile"
        Create a new ChatGPT conversation by cmdlet's alias(gpt), use Azure OpenAI service with the prompt.
    .EXAMPLE
        "why people smile" | gpt -azure
        Create a new ChatGPT conversation by cmdlet's alias(gpt), use Azure OpenAI service with the prompt from pipeline.
    .EXAMPLE
        gpt -azure -prompt "c:\temp\prompt.txt"
        Create a new ChatGPT conversation by cmdlet's alias(gpt), use Azure OpenAI service with the prompt from file.
    .EXAMPLE
        gpt -azure -prompt "c:\temp\prompt.txt" -context @{variable1="value1";variable2="value2"}
        Create a new ChatGPT conversation by cmdlet's alias(gpt), use Azure OpenAI service with the prompt from file, and pass some data to the prompt through the context parameter.
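    .EXAMPLE
        gpt -azure -prompt "Tell me a joke about {{topic}}" -context @{topic="cats"}
        Create a new chat completion by cmdlet's alias(gpt), use Azure OpenAI service with an inline prompt that contains a {{topic}} placeholder, which is replaced by the value passed in the context parameter. (The variable name topic is only an illustration; use whatever names you defined in your own prompt.)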
    .EXAMPLE
        gpt -azure -system "c:\temp\system.txt" -prompt "c:\temp\prompt.txt"
        Create a new ChatGPT conversation by cmdlet's alias(gpt), use Azure OpenAI service with the system prompt and prompt from file.
    .EXAMPLE
        gpt -azure -system "c:\temp\system.txt" -prompt "c:\temp\prompt.txt" -outFile "c:\temp\result.txt"
        Create a new ChatGPT conversation by cmdlet's alias(gpt), use Azure OpenAI service with the system prompt and prompt from file, then save the result to a file.
    .EXAMPLE
        gpt -azure -system "c:\temp\system.txt" -prompt "c:\temp\prompt.txt" -config @{temperature=1;max_tokens=1024}
        Create a new ChatGPT conversation by cmdlet's alias(gpt), use Azure OpenAI service with the system prompt and prompt from file and your customized settings.
    .EXAMPLE
        chat -local -model "llama3"
        Create a new ChatGPT conversation by using local LLMs, for example llama3. The default endpoint is http://localhost:11434/v1/chat/completions; you can modify it with the endpoint parameter as well.
    .OUTPUTS
        System.String, the completion result.
    .LINK
        https://github.com/chenxizhang/openai-powershell
    #>



    [CmdletBinding(DefaultParameterSetName = "default")]
    [Alias("chatgpt")][Alias("chat")][Alias("gpt")]
    param(
        [Parameter(ParameterSetName = "local", Mandatory = $true)]
        [Alias("ollama")]
        [switch]$local,
        [Parameter(ParameterSetName = "azure", Mandatory = $true)]
        [switch]$azure,
        [Parameter(ParameterSetName = "default")]
        [Parameter(ParameterSetName = "azure")]
        [string]$api_key,
        [Parameter(ParameterSetName = "default")]
        [Parameter(ParameterSetName = "azure")]    
        [Parameter(ParameterSetName = "local", Mandatory = $true)]
        [Alias("engine", "deployment")]
        [string]$model,
        [Parameter(ParameterSetName = "default")]
        [Parameter(ParameterSetName = "azure")]    
        [Parameter(ParameterSetName = "local")]
        [string]$endpoint,
        [Parameter(ParameterSetName = "default")]
        [Parameter(ParameterSetName = "azure")]    
        [Parameter(ParameterSetName = "local")]
        [string]$system = "You are a chatbot, please answer the user's question according to the user's language.",
        [Parameter(ParameterSetName = "default")]
        [Parameter(ParameterSetName = "azure")]    
        [Parameter(ParameterSetName = "local")]
        [Parameter(ValueFromPipeline = $true)]
        [string]$prompt = "",
        [Parameter(ParameterSetName = "default")]
        [Parameter(ParameterSetName = "azure")]    
        [Parameter(ParameterSetName = "local")]
        [PSCustomObject]$config,
        [Parameter(ParameterSetName = "azure")]
        [Alias("env")]
        [string]$environment,
        [Parameter(ParameterSetName = "azure")]
        [string]$api_version = "2023-09-01-preview",   
        [Parameter(ParameterSetName = "default")]
        [Parameter(ParameterSetName = "azure")]    
        [Parameter(ParameterSetName = "local")]
        [Alias("out")]   
        [string]$outFile,
        [Parameter(ParameterSetName = "default")]
        [Parameter(ParameterSetName = "azure")]    
        [Parameter(ParameterSetName = "local")]
        [switch]$json,
        [Parameter(ParameterSetName = "default")]
        [Parameter(ParameterSetName = "azure")]    
        [Parameter(ParameterSetName = "local")]
        [PSCustomObject]$context
    )
    BEGIN {

        Write-Verbose ($resources.verbose_parameters_received -f ($PSBoundParameters | Out-String))
        Write-Verbose ($resources.verbose_environment_received -f (Get-ChildItem Env:OPENAI_* | Out-String))
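
        # resolve the api_key, model and endpoint from the parameters or from environment variables, depending on whether the OpenAI, Azure OpenAI or a local service is used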

        switch ($PSCmdlet.ParameterSetName) {
            "default" {
                $api_key = if ($api_key) { $api_key } else { $env:OPENAI_API_KEY }
                $model = if ($model) { $model } else { if ($env:OPENAI_CHAT_MODEL) { $env:OPENAI_CHAT_MODEL }else { "gpt-3.5-turbo" } }
                $endpoint = if ($endpoint) { $endpoint } else { "https://api.openai.com/v1/chat/completions" }
            }
            "azure" {
                $api_key = if ($api_key) { $api_key } else { Get-FirstNonNullItemInArray("OPENAI_API_KEY_AZURE_$environment", "OPENAI_API_KEY_AZURE") }
                $model = if ($model) { $model } else { Get-FirstNonNullItemInArray("OPENAI_CHAT_DEPLOYMENT_AZURE_$environment", "OPENAI_CHAT_DEPLOYMENT_AZURE") }
                $endpoint = if ($endpoint) { "{0}openai/deployments/$model/chat/completions?api-version=$api_version" -f $endpoint } else { "{0}openai/deployments/$model/chat/completions?api-version=$api_version" -f (Get-FirstNonNullItemInArray("OPENAI_ENDPOINT_AZURE_$environment", "OPENAI_ENDPOINT_AZURE")) }
            }
            "local" {
                $endpoint = if ($endpoint) { $endpoint }else { "http://localhost:11434/v1/chat/completions" }
                $api_key = if ($api_key) { $api_key } else { "local" }
            }
        }

        Write-Verbose ($resources.verbose_parameters_parsed -f $api_key, $model, $endpoint)
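
        # validate the settings before processing: check connectivity to the OpenAI service (when not using Azure) and make sure an API key and a model are available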

        $hasError = $false

        if ((!$azure) -and ((Test-OpenAIConnectivity) -eq $False)) {
            Write-Error $resources.openai_unavaliable
            $hasError = $true
        }


        if (!$api_key) {
            Write-Error $resources.error_missing_api_key
            $hasError = $true
        }

        if (!$model) {
            Write-Error $resources.error_missing_engine
            $hasError = $true
        }
    }

    PROCESS {

        if ($hasError) {
            return
        }

        $telemetries = @{
            type = $PSCmdlet.ParameterSetName
        }

        # resolve the prompt: if it is a file path, URL or "lib:" reference, read the actual content
        $parsedprompt = Get-PromptContent($prompt)
        $prompt = $parsedprompt.content

        # if the user provided a context, inject the data into the prompt by replacing each {{key}} placeholder with the corresponding value
        if ($context) {
            Write-Verbose ($resources.verbose_context_received -f ($context | ConvertTo-Json -Depth 10))
            foreach ($key in $context.keys) {
                $prompt = $prompt -replace "{{$key}}", $context[$key]
            }
            Write-Verbose ($resources.verbose_prompt_context_injected -f $prompt)
        }

        $telemetries.Add("promptType", $parsedprompt.type)
        $telemetries.Add("promptLib", $parsedprompt.lib)

        # resolve the system prompt: if it is a file path, URL or "lib:" reference, read the actual content
        $parsedsystem = Get-PromptContent($system)
        $system = $parsedsystem.content

        # if the user provided a context, inject the data into the system prompt by replacing each {{key}} placeholder with the corresponding value
        if ($context) {
            Write-Verbose ($resources.verbose_context_received -f ($context | ConvertTo-Json -Depth 10))
            foreach ($key in $context.keys) {
                $system = $system -replace "{{$key}}", $context[$key]
            }
            Write-Verbose ($resources.verbose_prompt_context_injected -f $system)
        }

        $telemetries.Add("systemPromptType", $parsedsystem.type)
        $telemetries.Add("systemPromptLib", $parsedsystem.lib)

        # collect the telemetry data
        Submit-Telemetry -cmdletName $MyInvocation.MyCommand.Name -innovationName $MyInvocation.InvocationName -props $telemetries
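
        # prompt mode: if a prompt was provided, send a single chat completion request and return the result; otherwise start an interactive chat session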

        if ($PSBoundParameters.Keys.Contains("prompt")) {
            Write-Verbose ($resources.verbose_prompt_mode -f $prompt)
            $messages = @(
                @{
                    role    = "system"
                    content = $system
                },
                @{
                    role    = "user"
                    content = $prompt
                }
            ) 

            $params = @{
                Uri         = $endpoint
                Method      = "POST"
                Body        = @{model = "$model"; messages = $messages }
                Headers     = if ($azure) { @{"api-key" = "$api_key" } } else { @{"Authorization" = "Bearer $api_key" } }
                ContentType = "application/json;charset=utf-8"
            }

            if ($json) {
                $params.Body.Add("response_format" , @{type = "json_object" } )
            }


            if ($config) {
                Merge-Hashtable -table1 $params.Body -table2 $config
            }

            $params.Body = ($params.Body | ConvertTo-Json -Depth 10)

            Write-Verbose ($resources.verbose_prepare_params -f ($params | ConvertTo-Json -Depth 10))

            $response = Invoke-RestMethod @params
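
            # Windows PowerShell 5.x decodes the response body as ISO-8859-1, so re-interpret the content as UTF-8 to keep non-ASCII characters intact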

            if ($PSVersionTable['PSVersion'].Major -eq 5) {
                Write-Verbose ($resources.verbose_powershell_5_utf8)

                $dstEncoding = [System.Text.Encoding]::GetEncoding('iso-8859-1')
                $srcEncoding = [System.Text.Encoding]::UTF8

                $response.choices | ForEach-Object {
                    $_.message.content = $srcEncoding.GetString([System.Text.Encoding]::Convert($srcEncoding, $dstEncoding, $srcEncoding.GetBytes($_.message.content)))
                }

            }
            Write-Verbose ($resources.verbose_response_utf8 -f ($response | ConvertTo-Json -Depth 10))

            $result = $response.choices[0].message.content
            Write-Verbose ($resources.verbose_response_plain_text -f $result)

            # if the user specified an outFile, write the response to that file
            if ($outFile) {
                Write-Verbose ($resources.verbose_outfile_specified -f $outFile)
                $result | Out-File -FilePath $outFile -Encoding utf8
            }
            else {
                Write-Verbose ($resources.verbose_outfile_not_specified)
                Write-Output $result

                # if user does not specify the outfile, copy the response to clipboard
                # Set-Clipboard $result
                # Write-Host "Copied the response to clipboard." -ForegroundColor Green
            }

        }
        else {
            Write-Verbose ($resources.verbose_chat_mode)
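
            # stream the response only on PowerShell 6+ (where the HttpClient-based streaming path below is used); Windows PowerShell 5.1 falls back to a single non-streaming request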

            $stream = ($PSVersionTable['PSVersion'].Major -gt 5)

            $index = 1; 
            $welcome = "`n{0}`n{1}" -f ($resources.welcome_chatgpt -f $(if ($azure) { " $($resources.azure_version) " } else { "" }), $model), $resources.shortcuts
    
            Write-Host $welcome -ForegroundColor Yellow
            Write-Host $system -ForegroundColor Cyan
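
            # the conversation history is kept in $messages; the system prompt is stored separately and prepended to every request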
    
            $messages = @()
            $systemPrompt = @(
                [PSCustomObject]@{
                    role    = "system"
                    content = $system
                }
            )

            Write-Verbose "Prepare the system prompt: $($systemPrompt|ConvertTo-Json -Depth 10)"
            
            while ($true) {
                Write-Verbose ($resources.verbose_chat_let_chat)

                $current = $index++
                $prompt = Read-Host -Prompt "`n[$current] $($resources.prompt)"
                Write-Verbose "Prompt received: $prompt"
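
                # shortcut commands: q or bye exits the chat, m opens a multi-line input dialog (Windows only), f reads the prompt from a file picked in a dialog (Windows only)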
    
                if ($prompt -in ("q", "bye")) {
                    Write-Verbose ($resources.verbose_chat_q_message -f $prompt)
                    break
                }
    
                if ($prompt -eq "m") {

                    $os = [System.Environment]::OSVersion.Platform

                    if ($os -notin @([System.PlatformID]::Win32NT, [System.PlatformID]::Win32Windows, [System.PlatformID]::Win32S)) {
                        Write-Host ($resources.verbose_chat_m_message_not_supported)
                        continue
                    }

                    Write-Verbose ($resources.verbose_chat_m_message)
                    $prompt = Read-MultiLineInputBoxDialog -Message $resources.multi_line_prompt -WindowTitle $resources.multi_line_prompt -DefaultText ""

                    Write-Verbose ($resources.verbose_prompt_received -f $prompt)

                    if ($null -eq $prompt) {
                        Write-Host $resources.cancel_button_message
                        continue
                    }
                    else {
                        Write-Host "$($resources.multi_line_message)`n$prompt"
                    }
                }
    
                if ($prompt -eq "f") {

                    $os = [System.Environment]::OSVersion.Platform

                    if ($os -notin @([System.PlatformID]::Win32NT, [System.PlatformID]::Win32Windows, [System.PlatformID]::Win32S)) {
                        Write-Host ($resources.verbose_chat_f_message_not_supported)
                        continue
                    }

                    Write-Verbose ($resources.verbose_chat_f_message)
    
                    $file = Read-OpenFileDialog -WindowTitle $resources.file_prompt

                    Write-Verbose ($resources.verbose_chat_file_read -f $file)
    
                    if (!($file)) {
                        Write-Host $resources.cancel_button_message
                        continue
                    }
                    else {
                        $prompt = Get-Content $file -Encoding utf8
                        Write-Host "$($resources.multi_line_message)`n$prompt"
                    }
                }
    
                $messages += [PSCustomObject]@{
                    role    = "user"
                    content = $prompt
                }

                Write-Verbose ($resources.verbose_prepare_messages -f ($messages | ConvertTo-Json -Depth 10))
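
                # build the request body; only the system prompt plus the most recent 5 messages are sent, to limit the amount of history included with each request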
    
                $params = @{
                    Uri         = $endpoint
                    Method      = "POST"
                    Body        = @{model = "$model"; messages = ($systemPrompt + $messages[-5..-1]); stream = $stream } 
                    Headers     = if ($azure) { @{"api-key" = "$api_key" } } else { @{"Authorization" = "Bearer $api_key" } }
                    ContentType = "application/json;charset=utf-8"
                }

                if ($json) {
                    $params.Body.Add("response_format" , @{type = "json_object" } )
                }


                if ($config) {
                    Merge-Hashtable -table1 $params.Body -table2 $config
                }
                $params.Body = ($params.Body | ConvertTo-Json -Depth 10)


                Write-Verbose ($resources.verbose_prepare_params -f ($params | ConvertTo-Json -Depth 10))
    
                try {
    
                    if ($stream) {
                        Write-Verbose ($resources.verbose_chat_stream_mode)
                        $client = New-Object System.Net.Http.HttpClient
                        $body = $params.Body
                        Write-Verbose "body: $body"
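
                        # build the HTTP request manually so the response body can be read as a stream while tokens arrive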
    
                        $request = [System.Net.Http.HttpRequestMessage]::new()
                        $request.Method = "POST"
                        $request.RequestUri = $params.Uri
                        $request.Headers.Clear()
                        $request.Content = [System.Net.Http.StringContent]::new(($body), [System.Text.Encoding]::UTF8)
                        $request.Content.Headers.Clear()
                        $request.Content.Headers.Add("Content-Type", "application/json;charset=utf-8")

                        if ($azure) {
                            $request.Headers.Add("api-key", $api_key)
                        }
                        else {
                            $request.Headers.Add("Authorization", "Bearer $api_key")
                        }
                                            
                        $task = $client.Send($request)
                        $response = $task.Content.ReadAsStream()
                        $reader = [System.IO.StreamReader]::new($response)
                        $result = "" # message from the api
                        Write-Host -ForegroundColor Red "`n[$current] " -NoNewline
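
                        # read the server-sent events line by line; each "data:" line carries a JSON chunk with a content delta, and "data: [DONE]" marks the end of the stream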
    
                        while ($true) {
                            $line = $reader.ReadLine()
                            if (($null -eq $line) -or ($line -eq "data: [DONE]")) { break }
    
                            $chunk = ($line -replace "data: ", "" | ConvertFrom-Json).choices.delta.content
                            Write-Host $chunk -NoNewline -ForegroundColor Green
                            Write-Verbose ($resources.verbose_chat_stream_chunk_received -f $chunk)
                            $result += $chunk
    
                            Start-Sleep -Milliseconds 50
                        }
                        $reader.Close()
                        $reader.Dispose()
    
                        $messages += [PSCustomObject]@{
                            role    = "assistant"
                            content = $result
                        }

                        Write-Verbose ($resources.verbose_chat_message_combined -f ($messages | ConvertTo-Json -Depth 10))
                        Write-Host ""
    
                    }
                    else {

                        Write-Verbose ($resources.verbose_chat_not_stream_mode)
                        $stopwatch = [System.Diagnostics.Stopwatch]::StartNew()
                        $response = Invoke-RestMethod @params
                        Write-Verbose ($resources.verbose_chat_response_received -f ($response | ConvertTo-Json -Depth 10))

                        $stopwatch.Stop()
                        $result = $response.choices[0].message.content
                        $total_tokens = $response.usage.total_tokens
                        $prompt_tokens = $response.usage.prompt_tokens
                        $completion_tokens = $response.usage.completion_tokens
        
                        Write-Verbose ($resources.verbose_chat_response_summary -f $result, $total_tokens, $prompt_tokens, $completion_tokens)
        
                        if ($PSVersionTable['PSVersion'].Major -le 5) {

                            Write-Verbose ($resources.verbose_powershell_5_utf8)

                            $dstEncoding = [System.Text.Encoding]::GetEncoding('iso-8859-1')
                            $srcEncoding = [System.Text.Encoding]::UTF8
                            $result = $srcEncoding.GetString([System.Text.Encoding]::Convert($srcEncoding, $dstEncoding, $srcEncoding.GetBytes($result)))

                            Write-Verbose ($resources.verbose_response_utf8 -f $result)
                        }
        
                        $messages += [PSCustomObject]@{
                            role    = "assistant"
                            content = $result
                        }

                        Write-Verbose ($resources.verbose_chat_message_combined -f ($messages | ConvertTo-Json -Depth 10))
                
        
                        Write-Host -ForegroundColor Red ("`n[$current] $($resources.response)" -f $total_tokens, $prompt_tokens, $completion_tokens )
                        
                        Write-Host $result -ForegroundColor Green
                    }
                }
                catch {
                    Write-Error $_
                }
            }
        }


    }

}
