From 68117ed658aa2a52fb9e314ee83d28250b9133f5 Mon Sep 17 00:00:00 2001 From: Robert Shaw <114415538+rsnm2@users.noreply.github.com> Date: Thu, 22 Dec 2022 12:53:17 -0500 Subject: [PATCH] Rs/logging feature (#130) * added logging README * Update logging.mdx * Update logging.mdx * added the logging README.mdx * Update src/content/user-guide/deepsparse-engine/logging.mdx Co-authored-by: dbogunowicz <97082108+dbogunowicz@users.noreply.github.com> * Update src/content/user-guide/deepsparse-engine/logging.mdx Co-authored-by: dbogunowicz <97082108+dbogunowicz@users.noreply.github.com> * Update src/content/user-guide/deepsparse-engine/logging.mdx Co-authored-by: dbogunowicz <97082108+dbogunowicz@users.noreply.github.com> * Update src/content/user-guide/deepsparse-engine/logging.mdx Co-authored-by: dbogunowicz <97082108+dbogunowicz@users.noreply.github.com> * Update src/content/user-guide/deepsparse-engine/logging.mdx Co-authored-by: dbogunowicz <97082108+dbogunowicz@users.noreply.github.com> * Update src/content/user-guide/deepsparse-engine/logging.mdx Co-authored-by: dbogunowicz <97082108+dbogunowicz@users.noreply.github.com> * Update logging.mdx * Update logging.mdx * Update src/content/user-guide/deepsparse-engine/logging.mdx Co-authored-by: dbogunowicz <97082108+dbogunowicz@users.noreply.github.com> * Update logging.mdx * Update logging.mdx * Update logging.mdx * Update logging.mdx * Update logging.mdx * Update logging.mdx * updated logging * Update logging.mdx * Update logging.mdx * Update logging.mdx * Update logging.mdx * Update logging.mdx * Update logging.mdx * cleaned up logging doc * fixed small bug * added to model card * goldfish.jpeg * restructured files * added example fn * added custom * updated deepsparse.logging > deepsparse.loggers * fixed * stash * Delete goldfish.jpeg * Add files via upload * Update logging.mdx * Update logging.mdx * Update logging.mdx * Update logging.mdx * Update src/content/user-guide/deepsparse-engine/logging.mdx Co-authored-by: Jeannie Finks <74554921+jeanniefinks@users.noreply.github.com> * Update src/content/user-guide/deepsparse-engine/logging.mdx Co-authored-by: Jeannie Finks <74554921+jeanniefinks@users.noreply.github.com> * Update src/content/user-guide/deepsparse-engine/logging.mdx Co-authored-by: Jeannie Finks <74554921+jeanniefinks@users.noreply.github.com> * Update src/content/user-guide/deepsparse-engine/logging.mdx Co-authored-by: Jeannie Finks <74554921+jeanniefinks@users.noreply.github.com> * Update src/files-for-examples/logging/example_custom_logger.py Co-authored-by: Jeannie Finks <74554921+jeanniefinks@users.noreply.github.com> * Update src/files-for-examples/logging/example_custom_fn.py Co-authored-by: Jeannie Finks <74554921+jeanniefinks@users.noreply.github.com> * Update src/content/user-guide/deepsparse-engine/logging.mdx Co-authored-by: Jeannie Finks <74554921+jeanniefinks@users.noreply.github.com> * Update src/content/user-guide/deepsparse-engine/logging.mdx Co-authored-by: Jeannie Finks <74554921+jeanniefinks@users.noreply.github.com> Co-authored-by: dbogunowicz <97082108+dbogunowicz@users.noreply.github.com> Co-authored-by: Jeannie Finks <74554921+jeanniefinks@users.noreply.github.com> --- src/content/user-guide/deepsparse-engine.mdx | 4 + .../user-guide/deepsparse-engine/logging.mdx | 366 ++++++++++++++++++ .../logging/example_custom_fn.py | 10 + .../logging/example_custom_logger.py | 18 + src/files-for-examples/logging/goldfish.jpg | Bin 0 -> 6697 bytes .../server-config-with-custom-logger.yaml | 16 + 
 .../logging/server-config.yaml | 56 +++
 7 files changed, 470 insertions(+)
 create mode 100644 src/content/user-guide/deepsparse-engine/logging.mdx
 create mode 100644 src/files-for-examples/logging/example_custom_fn.py
 create mode 100644 src/files-for-examples/logging/example_custom_logger.py
 create mode 100644 src/files-for-examples/logging/goldfish.jpg
 create mode 100644 src/files-for-examples/logging/server-config-with-custom-logger.yaml
 create mode 100644 src/files-for-examples/logging/server-config.yaml

diff --git a/src/content/user-guide/deepsparse-engine.mdx b/src/content/user-guide/deepsparse-engine.mdx
index 7280bdca973..4c7c1aaa06c 100644
--- a/src/content/user-guide/deepsparse-engine.mdx
+++ b/src/content/user-guide/deepsparse-engine.mdx
@@ -31,4 +31,8 @@ This user guide offers more information for exploring additional and advanced fu
   Explains how to use the numactl utility for controlling resource utilization with DeepSparse.
+
+
+  Explains how to use DeepSparse Logging for monitoring production models.
+
diff --git a/src/content/user-guide/deepsparse-engine/logging.mdx b/src/content/user-guide/deepsparse-engine/logging.mdx
new file mode 100644
index 00000000000..1c911c28f58
--- /dev/null
+++ b/src/content/user-guide/deepsparse-engine/logging.mdx
@@ -0,0 +1,366 @@
+---
+title: "Logging"
+metaTitle: "DeepSparse Logging"
+metaDescription: "System and Data Logging with DeepSparse"
+index: 6000
+---
+
+# DeepSparse Logging
+
+This page explains how to use DeepSparse Logging to monitor your deployment.
+
+There are many types of monitoring tasks that you may want to perform to confirm your production system is working correctly.
+The difficulty of the tasks varies from relatively easy (simple system performance analysis) to challenging
+(assessing the accuracy of the system in the wild by manually labeling the input data after the fact). Examples include:
+- **System performance:** what is the latency/throughput of a query?
+- **Data quality:** is there an issue getting data to my model?
+- **Data distribution shift:** does the input data distribution deviate over time to the point where the model stops delivering reliable predictions?
+- **Model accuracy:** what percentage of its predictions does the model get right?
+
+DeepSparse Logging is designed to provide maximum flexibility for you to extract whatever data is needed from a
+production inference pipeline into the logging system of your choice.
+
+## Installation
+
+This page requires the [DeepSparse Server Install](/get-started/install/deepsparse).
+
+## Metrics
+DeepSparse Logging provides access to two types of metrics.
+
+### System Logging Metrics
+
+System Logging gives you access to granular performance metrics for quick and efficient diagnosis of system health.
+
+There is one group of System Logging Metrics currently available: Inference Latency. For each inference request, DeepSparse Server logs the following:
+1. Pre-processing Time - seconds in the pre-processing step
+2. Engine Time - seconds in the engine forward pass step
+3. Post-processing Time - seconds in the post-processing step
+4. Total Time - seconds for the end-to-end response time (sum of the prior three)
+
+### Data Logging Metrics
+
+Data Logging gives you access to data at each stage of an inference pipeline.
+This facilitates inspection of the data, understanding of its properties, detecting edge cases, and possible data drift.
+
+There are four stages in the inference pipeline where Data Logging can occur:
+- `pipeline_inputs`: raw input passed to the inference pipeline by the user
+- `engine_inputs`: pre-processed tensors passed to the engine for the forward pass
+- `engine_outputs`: result of the engine forward pass (e.g., the raw logits)
+- `pipeline_outputs`: final output returned to the pipeline caller
+
+At each stage, you can specify functions to be applied to the data before logging. Example functions include the identity function
+(for logging the raw input/output) or the mean function (e.g., for monitoring the mean pixel value of an image).
+
+There are three types of functions that can be applied to target data at each stage:
+- Built-in functions: pre-written functions provided by DeepSparse ([see list on GitHub](https://github.com/neuralmagic/deepsparse/blob/main/src/deepsparse/loggers/metric_functions/built_ins.py))
+- Framework functions: functions from `torch` or `numpy`
+- Custom functions: user-provided functions
+
+## Configuration
+
+The YAML-based Server Config file is used to configure both System and Data Logging.
+- System Logging is *enabled* by default. If no logger is specified, Python Logger is used.
+- Data Logging is *disabled* by default. The config allows you to specify what data to log.
+
+See [the Server documentation](/user-guide/deploying-deepsparse/deepsparse-server) for more details on the Server config file.
+
+### Logging YAML Syntax
+
+There are two key elements that should be added to the Server Config to set up logging.
+
+First is `loggers`. This element configures the loggers that are used by the Server. Each element is a dictionary of the form `{logger_name: {arg_1: arg_value}}`.
+
+Second is `data_logging`. This element identifies which data should be logged for an endpoint and how. It is a dictionary of the form `{identifier: [log_config]}`.
+
+- `identifier` specifies the stages where logging should occur. It can either be a pipeline `stage` (see stages above) or `stage.property` if the data type
+at a particular stage has a property. If the data type at a `stage` is a dictionary or list, you can access elements via slicing, indexing, or dict access,
+for example `stage[0][:,:,0]['key3']`.
+
+- `log_config` specifies which function to apply, which logger(s) to use, and how often to log. It is a dictionary of the form
+`{func: name, frequency: freq, target_loggers: [logger_names]}`.
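+
+Putting the two elements together, the pieces nest as in the minimal sketch below. The task, model stub, identifier,
+function, and logger names here are placeholders only; a concrete, runnable configuration follows in the Tangible Example.
+
+```yaml
+# skeleton-config.yaml (placeholder names only)
+loggers:
+  logger_name:              # e.g., python or prometheus
+    arg_1: arg_value        # logger-specific arguments, if any
+
+endpoints:
+  - task: task_name
+    model: zoo:model/stub/path
+    data_logging:
+      identifier:               # a stage, or stage.property
+        - func: function_name   # built-in, framework, or custom function
+          frequency: 1          # how often to log (once every N predictions)
+          target_loggers:
+            - logger_name
+```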
+ +### Tangible Example +Here's an example for an image classification server: + +```yaml +# example-config.yaml +loggers: + python: # logs to stdout + prometheus: # logs to prometheus on port 6100 + port: 6100 + +endpoints: + - task: image_classification + route: /image_classification/predict + model: zoo:cv/classification/resnet_v1-50/pytorch/sparseml/imagenet/pruned95_quant-none + data_logging: + pipeline_inputs.images: # applies to the images (of the form stage.property) + - func: np.shape # framework function + frequency: 1 + target_loggers: + - python + + pipeline_inputs.images[0]: # applies to the first image (of the form stage.property[idx]) + - func: mean_pixels_per_channel # built-in function + frequency: 2 + target_loggers: + - python + - func: fraction_zeros # built-in function + frequency: 2 + target_loggers: + - prometheus + + engine_inputs: # applies to the engine_inputs data (of the form stage) + - func: np.shape # framework function + frequency: 1 + target_loggers: + - python +``` + +This configuration does the following data logging at each respective stage of the pipeline: +- System logging is enabled by default and logs to Prometheus and StdOut +- Logs the shape of the input batch provided by the user to stdout +- Logs the mean pixels and % of 0 pixels of the first image in the batch to Prometheus +- Logs the raw data and shape of the input passed to the engine to Python +- No logging occurs at any other pipeline stages + +## Loggers + +DeepSparse Logging includes options to log to Standard Output and to Prometheus out of the box as well as +the ability to create a Custom Logger. + +### Python Logger + +Python Logger logs data to Standard Output. It is useful for debugging and inspecting an inference pipeline. It +accepts no arguments and is configured with the following: + +```yaml +loggers: + python: +``` + +### Prometheus Logger + +DeepSparse is integrated with Prometheus, enabling you to easily instrument your model service. +The Prometheus Logger accepts some optional arguments and is configured as follows: + +```yaml +loggers: + prometheus: + port: 6100 + text_log_save_frequency: 10 # optional + text_log_save_dir: text/log/save/dir # optional + text_log_file_name: text_log_file_name # optional +``` + +There are four types of metrics in Prometheus (Counter, Gauge, Summary, and Histogram). DeepSparse uses +[Summary](https://prometheus.io/docs/concepts/metric_types/#summary) under the hood, so make sure the data you +are logging to Prometheus is an Int or a Float. + +### Custom Logger + +If you need a custom logger, you can create a class that inherits from the `BaseLogger` +and implements the `log` method. The `log` method is called at each pipeline stage and should handle exposing the metric to the Logger. + +```python +from deepsparse.loggers import BaseLogger +from typing import Any, Optional + +class CustomLogger(BaseLogger): + def log(self, identifier: str, value: Any, category: Optional[str]=None): + """ + :param identifier: The name of the item that is being logged. + By default, in the simplest case, that would be a string in the form + of "/" + e.g. "image_classification/pipeline_inputs" + :param value: The item that is logged along with the identifier + :param category: The metric category that the log belongs to. + By default, we recommend sticking to our internal convention + established in the MetricsCategories enum. 
+ """ + print("Logging from a custom logger") + print(identifier) + print(value) +``` + +Once a custom logger is implemented, it can be referenced from a config file: + +```yaml +# server-config-with-custom-logger.yaml +loggers: + custom_logger: + path: example_custom_logger.py:CustomLogger + # arg_1: your_arg_1 + +endpoints: + - task: sentiment_analysis + route: /sentiment_analysis/predict + model: zoo:nlp/sentiment_analysis/bert-base/pytorch/huggingface/sst2/12layer_pruned80_quant-none-vnni + name: sentiment_analysis_pipeline + data_logging: + pipeline_inputs: + - func: identity + frequency: 1 + target_loggers: + - custom_logger +``` + +Download the following for an example of a custom logger: + +```bash +wget https://raw.githubusercontent.com/neuralmagic/docs/rs/logging-feature/src/files-for-examples/logging/example_custom_logger.py +wget https://raw.githubusercontent.com/neuralmagic/docs/rs/logging-feature/src/files-for-examples/logging/server-config-with-custom-logger.yaml +``` + +Launch the server: + +```bash +deepsparse.server --config-file server-config-with-custom-logger.yaml +``` + +Submit a request: + +```python +import requests +url = "http://0.0.0.0:5543/sentiment_analysis/predict" +obj = {"sequences": "Snorlax loves my Tesla!"} +resp = requests.post(url=url, json=obj) +print(resp.text) +``` + +You should see data printed to the Server's standard output. + +See [our Prometheus logger implementation](https://github.com/neuralmagic/deepsparse/blob/main/src/deepsparse/loggers/prometheus_logger.py) +for inspiration on implementing a logger. + +## Usage + +DeepSparse Logging is currently supported for usage with DeepSparse Server. + +### Server Usage + +The Server startup CLI command accepts a YAML configuration file (which contains both logging-specific and general +configuration details) via the `--config-file` argument. + +Data Logging is configured at the endpoint level. The configuration file below creates a Server with two endpoints +(one for image classification and one for sentiment analysis): + +```yaml +# server-config.yaml +loggers: + python: + prometheus: + port: 6100 + +endpoints: + - task: image_classification + route: /image_classification/predict + model: zoo:cv/classification/resnet_v1-50/pytorch/sparseml/imagenet/pruned95_quant-none + name: image_classification_pipeline + data_logging: + pipeline_inputs.images: + - func: np.shape + frequency: 1 + target_loggers: + - python + + pipeline_inputs.images[0]: + - func: max_pixels_per_channel + frequency: 1 + target_loggers: + - python + - func: mean_pixels_per_channel + frequency: 1 + target_loggers: + - python + - func: fraction_zeros + frequency: 1 + target_loggers: + - prometheus + + pipeline_outputs.scores[0]: + - func: identity + frequency: 1 + target_loggers: + - prometheus + + - task: sentiment_analysis + route: /sentiment_analysis/predict + model: zoo:nlp/sentiment_analysis/bert-base/pytorch/huggingface/sst2/12layer_pruned80_quant-none-vnni + name: sentiment_analysis_pipeline + data_logging: + engine_inputs: + - func: example_custom_fn.py:sequence_length + frequency: 1 + target_loggers: + - python + - prometheus + + pipeline_outputs.scores[0]: + - func: identity + frequency: 1 + target_loggers: + - python + - prometheus +``` + +#### Custom Data Logging Function + +The example above included a custom function for computing sequence lengths. Custom +Functions should be defined in a local Python file. They should accept one argument +and return a single output. 
+
+The `example_custom_fn.py` file could look like the following:
+
+```python
+import numpy as np
+from typing import List
+
+# Engine inputs to transformers are three lists of np.arrays representing
+# the encoded input, the attention mask, and token types.
+# Each of the np.arrays is of shape (batch, max_seq_len), so
+# engine_inputs[0][0] gives the encodings of the first item in the batch.
+# The number of non-zeros in this slice is the sequence length.
+def sequence_length(engine_inputs: List[np.ndarray]):
+    return np.count_nonzero(engine_inputs[0][0])
+```
+
+#### Launching the Server and Logging Metrics
+
+Download `server-config.yaml`, `example_custom_fn.py`, and `goldfish.jpg` for the demo.
+
+```bash
+wget https://raw.githubusercontent.com/neuralmagic/docs/rs/logging-feature/src/files-for-examples/logging/server-config.yaml
+wget https://raw.githubusercontent.com/neuralmagic/docs/rs/logging-feature/src/files-for-examples/logging/example_custom_fn.py
+wget https://raw.githubusercontent.com/neuralmagic/docs/rs/logging-feature/src/files-for-examples/logging/goldfish.jpg
+```
+
+Launch the Server with the following:
+
+```bash
+deepsparse.server --config-file server-config.yaml
+```
+
+Submit a request to the image classification endpoint with the following:
+
+```python
+import requests
+url = "http://0.0.0.0:5543/image_classification/predict/from_files"
+paths = ["goldfish.jpg"]
+files = [("request", open(img, 'rb')) for img in paths]
+resp = requests.post(url=url, files=files)
+print(resp.text)
+```
+
+Submit a request to the sentiment analysis endpoint with the following:
+
+```python
+import requests
+url = "http://0.0.0.0:5543/sentiment_analysis/predict"
+obj = {"sequences": "Snorlax loves my Tesla!"}
+resp = requests.post(url=url, json=obj)
+print(resp.text)
+```
+
+You should see the metrics logged to the Server's standard output and to Prometheus (visit `http://localhost:6100` to quickly inspect the exposed metrics).
diff --git a/src/files-for-examples/logging/example_custom_fn.py b/src/files-for-examples/logging/example_custom_fn.py
new file mode 100644
index 00000000000..a10b1eb73b9
--- /dev/null
+++ b/src/files-for-examples/logging/example_custom_fn.py
@@ -0,0 +1,10 @@
+import numpy as np
+from typing import List
+
+# Engine inputs to transformers are three lists of np.arrays representing
+# the encoded input, the attention mask, and token types.
+# Each of the np.arrays is of shape (batch, max_seq_len), so
+# engine_inputs[0][0] gives the encodings of the first item in the batch.
+# The number of non-zeros in this slice is the sequence length.
+def sequence_length(engine_inputs: List[np.ndarray]):
+    return np.count_nonzero(engine_inputs[0][0])
\ No newline at end of file
diff --git a/src/files-for-examples/logging/example_custom_logger.py b/src/files-for-examples/logging/example_custom_logger.py
new file mode 100644
index 00000000000..9802a3d68ed
--- /dev/null
+++ b/src/files-for-examples/logging/example_custom_logger.py
@@ -0,0 +1,18 @@
+from deepsparse.loggers import BaseLogger
+from typing import Any, Optional
+
+class CustomLogger(BaseLogger):
+    def log(self, identifier: str, value: Any, category: Optional[str]=None):
+        """
+        :param identifier: The name of the item that is being logged.
+            By default, in the simplest case, that would be a string in the form
+            of "/"
+            e.g., "image_classification/pipeline_inputs"
+        :param value: The item that is logged along with the identifier
+        :param category: The metric category that the log belongs to.
+            By default, we recommend sticking to our internal convention
+            established in the MetricsCategories enum.
+        """
+        print("Logging from a custom logger")
+        print(identifier)
+        print(value)
\ No newline at end of file
diff --git a/src/files-for-examples/logging/goldfish.jpg b/src/files-for-examples/logging/goldfish.jpg
new file mode 100644
index 0000000000000000000000000000000000000000..94e86b17d4140267a3c869069f3a381441695430
GIT binary patch
literal 6697
[base85-encoded binary image data omitted]