# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/access-http-specific-features-in-services.md

---
slug: access-http-specific-features-in-services
title: Access HTTP-specific Features in Services
---

ServiceStack is based on [http handlers](http://msdn.microsoft.com/en-us/library/system.web.ihttphandler.aspx), but ServiceStack provides a clean, dependency-free [IService](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/IService.cs) to implement your Web Services logic in. The philosophy behind this approach is that the fewer dependencies you have on your environment and its request context, the more testable and re-usable your services become.

::: info
The core [IRequest](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Web/IRequest.cs) and [IResponse](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Web/IResponse.cs) interfaces are used in filters and Services
:::

### Request Filters

The Request Filters are applied before the service gets called and accept: (IRequest, IResponse, RequestDto) e.g:

```csharp
//Add a request filter to check if the user has a session initialized
this.RequestFilters.Add((httpReq, httpRes, requestDto) =>
{
    var header      = httpReq.Headers["HttpHeader"];
    var queryParam  = httpReq.QueryString["queryParam"];
    var formParam   = httpReq.Form["htmlFormParam"];
    var anyParam    = httpReq.GetParam("aParamInAnyOfTheAbove");
    var cookie      = httpReq.Cookies["requestCookie"];
    var absoluteUri = httpReq.AbsoluteUri;
    httpReq.Items["requestData"] = "Share data between Filters and Services";

    //Access underlying Request in ASP.NET hosts
    var aspNetRequest = httpReq.OriginalRequest as HttpRequestBase;
    //Access underlying Request in HttpListener hosts
    var listenerRequest = httpReq.OriginalRequest as HttpListenerRequest;
});
```

#### Services

When inheriting from Service you can access them via `base.Request` and
`base.Response`:

```csharp
public class MyService : Service
{
    public object Any(Request request)
    {
        var value = base.Request.GetParam("aParamInAnyHeadersFormOrQueryString");
        base.Response.AddHeader("X-CustomHeader", "Modify HTTP Response in Service");
        return value;
    }
}
```

#### Response Filters

The Response Filters are applied after your service is called and accept: (IRequest, IResponse, ResponseDto) e.g. add a response filter to add a 'Content-Disposition' header so browsers treat it as a native .csv file:

```csharp
this.ResponseFilters.Add((req, res, responseDto) =>
{
    if (req.ResponseContentType == ContentType.Csv)
    {
        res.AddHeader(HttpHeaders.ContentDisposition,
            $"attachment;filename={req.OperationName}.csv");
    }

    //Access underlying Response in ASP.NET hosts
    var aspNetResponse = res.OriginalResponse as HttpResponseBase;
    //Access underlying Response in HttpListener hosts
    var listenerResponse = res.OriginalResponse as HttpListenerResponse;
});
```

### Communicating throughout the Request Pipeline

The recommended way to pass additional metadata about the request is to use the `IRequest.Items` collection. E.g. you can change which Razor View template the response DTO gets rendered in with:

```csharp
httpReq.Items["Template"] = "_CustomLayout";
...
var preferredLayout = httpReq.Items["Template"];
```

## Advantages of having dependency-free services

If you don't need to access HTTP-specific features, your services can be called by any non-HTTP endpoint, like from a [message queue](/messaging).

### Injecting the IRequest into your Service

Although working in a clean-room can be ideal from a re-usability and testability point of view, you stand the chance of missing out on a lot of the features present in HTTP.
Just like with the built-in Funq IOC container, you can tell ServiceStack to inject the request context by implementing the [IRequiresRequest](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Web/IRequiresRequest.cs) interface, which will get the [IRequest](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Web/IRequest.cs) injected before each request.

::: info
ServiceStack's convenient `Service` base class already implements `IRequiresRequest`, which allows you to access the `IRequest` with `base.Request` and the HTTP Response with `base.Response`
:::

::: info
To return a customized HTTP Response, e.g. to set Response Cookies or Headers, return the [HttpResult](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack/HttpResult.cs) object
:::

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/add-servicestack-reference.md

---
slug: add-servicestack-reference
title: Add ServiceStack Reference
---

ServiceStack's **Add ServiceStack Reference** feature allows adding generated Native Types for the most popular typed languages and client platforms directly from within most major IDEs, starting with [ServiceStackVS](/create-your-first-webservice#step-1-download-and-install-servicestackvs) - providing a simpler, cleaner and more versatile alternative to WCF's legacy **Add Service Reference** feature built into VS.NET.
Add ServiceStack Reference now supports [C#](/csharp-add-servicestack-reference), [TypeScript](/typescript-add-servicestack-reference), [JavaScript](/javascript-add-servicestack-reference), [Python](/python-add-servicestack-reference), [PHP](/php-add-servicestack-reference), [Swift](/swift-add-servicestack-reference), [Java](/java-add-servicestack-reference), [Kotlin](/kotlin-add-servicestack-reference), [Dart](/dart-add-servicestack-reference), [F#](/fsharp-add-servicestack-reference), [VB.NET](/vbnet-add-servicestack-reference) and [ES3 Common.js](/commonjs-add-servicestack-reference), including integration with most leading IDEs to provide a flexible alternative to sharing your DTO assembly with clients. Clients can now easily add a reference to a remote ServiceStack url and update DTOs directly from within VS.NET, Xamarin Studio, Xcode, Android Studio, IntelliJ and Eclipse. We plan on expanding this foundation into seamless, typed, end-to-end integration with other languages - add a [feature request for your favorite language](https://servicestack.net/ideas) to prioritize support for it sooner!

Native Types provides an alternative to sharing DTO dlls that can enable a better dev workflow for external clients, who are now able to generate (and update) Typed APIs for your Services from a single remote url directly within their favorite IDE - reducing the burden and effort required to consume ServiceStack Services whilst benefiting from their language's native strong-typing feedback.

### Supported Languages
C#, TypeScript, JavaScript, Python, PHP, Swift, Java, Kotlin, Dart, F# and VB.NET
![](./img/pages/add-ss-ref.svg)

### IDE Integration

To provide a seamless Development Experience, Add ServiceStack Reference is available as a plugin in most major IDEs, allowing your API Consumers to easily add a typed Service Reference to your Services with just its URL, in their preferred language, from within JetBrains Rider, VS.NET, Android Studio, PyCharm, IntelliJ IDEA, RubyMine, PhpStorm, WebStorm and Eclipse:

[![](./img/pages/servicestack-reference/ide-plugins-splash.png)](https://www.youtube.com/watch?v=JKsgrstNnYY)

Here's a quick walk through of installing the **ServiceStack** plugin and using it to add a remote ServiceStack Reference in a new C# Application:

:::tip
VSCode and other IDEs can use the [Simple Command Line Utility](#simple-command-line-utilities) to add and update multiple Services references with a single command.
:::

### Call ServiceStack APIs from a Flutter App with native Dart client and DTOs

A walk through showing how you can use ServiceStack's Dart client library with your Flutter Android application to quickly get up and running with Add ServiceStack Reference.

### C# Xamarin.Android Example in VS.NET

Using C# to develop native Mobile and Desktop Apps provides a number of benefits, including maximum reuse of your investments across multiple Client Apps, where they're able to reuse shared functionality, libraries, knowledge, development workflow and environment in both Client and Server Apps.

### Call ServiceStack APIs from Python

This video tutorial looks at how we can leverage Add ServiceStack Reference for Python in PyCharm, VSCode and [Python Jupyter Notebooks](/jupyter-notebooks-python).

### Call ServiceStack APIs from PHP

This video tutorial looks at how we can easily integrate .NET Typed APIs to extend popular PHP Applications like Wordpress, Drupal or Laravel with [PHP Add ServiceStack Reference](/php-add-servicestack-reference).
### Instant Client Apps

[Instant Client Apps](https://apps.servicestack.net/) is a free tool to jump start your native client application development using a wide range of languages and platforms including: C#, NodeJS, Dart, Java, Kotlin, Swift, VB .NET and F#.

## gRPC

[ServiceStack gRPC](/grpc/) enables a highly productive development environment for developing high-performance gRPC HTTP/2 Services by making ServiceStack's existing typed Services available from ASP.NET's gRPC endpoints, where ServiceStack offers a simplified development model for gRPC Clients for streamlined end-to-end productivity.

## C# Mobile and Desktop Apps

[![](https://raw.githubusercontent.com/ServiceStackApps/HelloMobile/master/screenshots/splash-900.png)](https://github.com/ServiceStackApps/HelloMobile)

The generated DTOs provide a highly productive development workflow and enable a succinct end-to-end Typed API that can be used in both **.NET Framework** and **.NET Standard 2.0** [Generic Service Clients](/csharp-client) to facilitate Rapid Development in .NET's most popular Mobile and Desktop platforms:

- WPF
- UWP
- Xamarin.Android
- Xamarin.iOS
- Xamarin.OSX
- Xamarin.Forms
- iOS
- Android
- UWP

The [HelloMobile](https://github.com/ServiceStackApps/HelloMobile) project contains multiple versions of the same App in all the above platforms, demonstrating a number of different calling conventions, service integrations and reuse possibilities.
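The shared-DTO workflow these Apps rely on can be sketched as follows, using a hypothetical `Hello` Service where the DTO classes, URL and property names are illustrative, not part of any real App:

```csharp
// DTOs defined once in the shared ServiceModel project (hypothetical example)
[Route("/hello/{Name}")]
public class Hello : IReturn<HelloResponse>
{
    public string Name { get; set; }
}

public class HelloResponse
{
    public string Result { get; set; }
}

// Any of the above client platforms can then reuse the same DTOs
// with a generic Service Client for end-to-end typed API calls:
var client = new JsonServiceClient("https://example.org");
HelloResponse response = await client.GetAsync(new Hello { Name = "World" });
```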
ServiceStack also allows for maximum reuse by letting you reuse the same POCO DTOs used to define the Services contract in Client Apps to provide its end-to-end typed API, without any additional custom build tools, code-gen or other artificial machinery - using just the DTOs in the shared `ServiceModel.dll` with any of the available highly performant [.NET generic Service Clients](/csharp-client), which by design encourage development of [resilient message-based Services](/what-is-a-message-based-web-service), enabling [highly decoupled](/service-gateway) and easily [substitutable and mockable](/csharp-client#built-in-clients) Service Integrations.

## Utilize Native SDKs and Languages

Add ServiceStack Reference lets you utilize native SDKs and development environments whilst maintaining the same productive development experience made possible with native idiomatic Service Clients in Web and across the most popular Mobile and Desktop platforms.

App Developers can generate Typed DTOs for any ServiceStack Service in Android Apps using either [Java](/java-add-servicestack-reference) or [Kotlin](/kotlin-add-servicestack-reference), use [Swift](/swift-add-servicestack-reference) for development of native iOS or OSX Apps, or [TypeScript](/typescript-add-servicestack-reference) for calling Services from [React Native, Node.js or Web Apps](https://github.com/ServiceStackApps/typescript-server-events).

### Flexible Customizations

Options for the generated DTOs can be further customized by updating the commented section in the header of the file. Each language has different options for leveraging features native to it.
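For example, the commented header at the top of a generated C# `dtos.cs` looks something like the sketch below, where the dates, version and option values shown are illustrative; uncommenting an option, changing its value and updating the reference regenerates the DTOs with it applied:

```csharp
/* Options:
Date: 2024-01-01 00:00:00
Version: 8.0
BaseUrl: https://techstacks.io

//GlobalNamespace:
//MakePartial: True
//MakeVirtual: True
//AddReturnMarker: True
//IncludeTypes:
//ExcludeTypes:
*/
```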
See the specific language documentation for details on available options:

* [C# Options](/csharp-add-servicestack-reference#change-default-server-configuration)
* [TypeScript Options](./typescript-add-servicestack-reference.md#customize-dto-type-generation)
* [JavaScript Options](./javascript-add-servicestack-reference.md#customize-dto-type-generation)
* [Python Options](./python-add-servicestack-reference.md#customize-dto-type-generation)
* [Swift Options](/swift-add-servicestack-reference#swift-configuration)
* [Java Options](/java-add-servicestack-reference#java-configuration)
* [Kotlin Options](/kotlin-add-servicestack-reference#kotlin-configuration)
* [Dart Options](/dart-add-servicestack-reference#change-default-server-configuration)
* [F# Options](/fsharp-add-servicestack-reference#change-default-server-configuration)
* [VB.NET Options](/vbnet-add-servicestack-reference)

## Simple command-line utilities

The [x dotnet tool](/dotnet-tool) provides simple command-line utilities to easily Add and Update ServiceStack References for all of ServiceStack's supported languages.

## Installation

:::sh
dotnet tool install --global x
:::

::include npx-get-dtos.md::

This will make the following utilities available from your command-line, letting you download the Server DTO classes for a remote ServiceStack endpoint in your chosen language, which you can use with ServiceStack's generic Service clients to make end-to-end API calls.
| Script | Alias | Language |
| -- | -- | -- |
| x csharp | x cs | C# |
| x typescript | x ts | TypeScript |
| x mjs | | JavaScript |
| x python | x py | Python |
| x java | | Java |
| x kotlin | x kt | Kotlin |
| x swift | | Swift |
| x dart | | Dart |
| x vbnet | x vb | VB.NET |
| x fsharp | x fs | F# |
## Usage

We'll walk through an example using TypeScript to download Server Types from the [techstacks.io](https://techstacks.io) ServiceStack Website to see how this works:

### Adding a ServiceStack Reference

To add a TypeScript ServiceStack Reference just call `x typescript` with the URL of a remote ServiceStack instance:

:::sh
x typescript https://techstacks.io
:::

Result:

```
Saved to: dtos.ts
```

Calling `x typescript` with just a URL will save the DTOs using the Host name; you can override this by specifying a FileName as the 2nd argument:

:::sh
x typescript https://techstacks.io Tech
:::

Result:

```
Saved to: Tech.dtos.ts
```

### Updating a ServiceStack Reference

To update an existing ServiceStack Reference, call `x typescript` with the Filename:

:::sh
x typescript dtos.ts
:::

Result:

```
Updated: dtos.ts
```

Which will update the File with the latest TypeScript Server DTOs from [techstacks.io](https://techstacks.io). You can also customize how DTOs are generated by uncommenting the [TypeScript DTO Customization Options](/typescript-add-servicestack-reference#dto-customization-options) and updating them again.

#### Updating all TypeScript DTOs

Calling `x typescript` without any arguments will update **all TypeScript DTOs** in the current directory:

:::sh
x typescript
:::

Result:

```
Updated: Tech.dtos.ts
Updated: dtos.ts
```

To make it more wrist-friendly you can also use the shorter `x ts` alias instead of `x typescript`.

### Installing Generic Service Client

Now we have our TechStacks Server DTOs we can use them with the generic `JsonServiceClient` in the [@servicestack/client](https://www.npmjs.com/package/@servicestack/client) npm package to make Typed API Calls.
:::sh
npm install @servicestack/client
:::

#### TechStacks Example

Once installed, create a `demo.ts` file with the example below using both the `JsonServiceClient` from the **@servicestack/client** npm package and the Server DTOs we want to use from our local `dtos.ts` above:

```ts
import { JsonServiceClient } from '@servicestack/client';
import { GetTechnology, GetTechnologyResponse } from './dtos';

const client = new JsonServiceClient("https://techstacks.io");

async function main() {
    const request = new GetTechnology();
    request.Slug = "ServiceStack";

    const response = await client.get(request);
    console.log(response.Technology.VendorUrl);
}

main();
```

The `JsonServiceClient` is populated with the **BaseUrl** of the remote ServiceStack instance we wish to call. Once initialized we can send populated Request DTOs and handle the Typed Response DTOs returned in Promises.

To run our TypeScript example we just need to compile it with TypeScript:

:::sh
tsc demo.ts
:::

Which will generate the compiled `demo.js` (and `dtos.js`) which we can then run with node:

:::sh
node demo.js
:::

Result:

```
https://servicestack.net
```

#### [Invoke ServiceStack APIs from the command-line](/post-command)

Easily inspect and invoke C# .NET Web APIs from the command-line with Post Command, which lets you both inspect and call any ServiceStack API with just its name and a JS Object literal. API Responses are returned in human-friendly markdown tables by default, or optionally as JSON & raw HTTP.

### Built in Authentication

One of the benefits of utilizing smart generic Service Clients is being able to embed high-level generic functionality like Authentication that would be tedious and error prone for all API Consumers to have to implement manually.
All smart generic Service Clients support most of the [built-in Authentication](/auth/authentication-and-authorization) options, including [OAuth Providers](https://github.com/ServiceStackApps/AndroidJavaChat) and [Sign In with Apple](/auth/signin-with-apple), and are able to take advantage of the integrated and transparent JWT and Refresh Token Cookie support.

### Refresh Token Cookies supported in all Service Clients

::include jwt-service-clients.md::

### Integrate with Visual Studio

You can also easily integrate this within your VS.NET dev workflows by [adding it as an External Tool](https://docs.microsoft.com/en-us/visualstudio/ide/managing-external-tools?view=vs-2019) in the **External Tools** dialog box by choosing `Tools > External Tools`:

![](/img/pages/servicestack-reference/tool-ts-reference.png)

| | |
|-|-|
| Title | Update TypeScript &Reference |
| Command | web.exe |
| Arguments | ts |
| Initial directory | $(ProjectDir) |

Which will then let you update all your `*dtos.ts` TypeScript References in your project by clicking on `Tools > Update TypeScript Reference` or using the `ALT+T R` keyboard shortcut.

If you want to update your `*dtos.cs` **C# ServiceStack References** instead, just change Arguments to `cs`:

![](/img/pages/servicestack-reference/tool-cs-reference.png)

| | |
|-|-|
| **Title** | Update C# &Reference |
| **Command** | web.exe |
| **Arguments** | cs |
| **Initial directory** | $(ProjectDir) |

Refer to the [x usage output](#usage) above for the arguments or aliases for all other supported languages.
### Integrate with Rider Just like with VS.NET above you can [add an External Tool](https://www.jetbrains.com/help/rider/Settings_Tools_External_Tools.html) in [JetBrains Rider](https://www.jetbrains.com/rider/) by opening the Settings dialog with `CTRL+ALT+S` then searching for `external tools` under the **Tools** category: ![](/img/pages/servicestack-reference/rider-tool-ts-reference.png) | || |-|-| | **Name** | Update TypeScript Reference | | **Command** | web.exe | | **Arguments** | ts | | **Working directory** | $FileParentDir$ | | || Now you can update your `*dtos.ts` TypeScript References in your project by clicking on `External Tools > Update TypeScript Reference` in the right-click context menu: ![](/img/pages/servicestack-reference/rider-tool-ts-reference-run.png) If you're updating references frequently you can save time by [assigning it a keyboard shortcut](https://www.jetbrains.com/help/rider/Configuring_Keyboard_and_Mouse_Shortcuts.html). ## Multiple File Upload Support with API Requests supported in all languages To be able to call [AI Server](/ai-server/) APIs requiring file uploads we've added multiple file upload support with API Requests to the generic service clients for all our supported languages. 
Here's what that looks like for different languages calling AI Server's `SpeechToText` API:

### C# Speech to Text

```csharp
using var fsAudio = File.OpenRead("audio.wav");
var response = client.PostFileWithRequest(new SpeechToText(),
    new UploadFile("audio.wav", fsAudio, "audio"));
```

### Dart Speech to Text

```dart
var audioFile = new File('audio.wav');
var uploadFile = new UploadFile(
    fieldName: 'audio',
    fileName: audioFile.uri.pathSegments.last,
    contentType: 'audio/wav',
    contents: await audioFile.readAsBytes()
);
var response = await client.postFileWithRequest(new SpeechToText(), uploadFile);
```

### Python Speech to Text

```python
with open("files/audio.wav", "rb") as audio:
    response = client.post_file_with_request(SpeechToText(),
        UploadFile(field_name="audio", file_name="audio.wav",
            content_type="audio/wav", stream=audio))
```

### PHP Speech to Text

```php
$audioFile = __DIR__ . '/files/audio.wav';
/** @var GenerationResponse $response */
$response = $client->postFileWithRequest(new SpeechToText(), new UploadFile(
    filePath: $audioFile,
    fileName: 'audio.wav',
    fieldName: 'audio',
    contentType: 'audio/wav'
));
```

### Swift Speech to Text

```swift
guard let audioURL = Bundle.module.url(forResource: "audio.wav", withExtension: nil)
else { return }
let audioData = try Data(contentsOf: audioURL)
let response: TextGenerationResponse = try await client.postFileWithRequestAsync(
    request: SpeechToText(),
    file: UploadFile(fileName: "audio.wav", data: audioData, fieldName: "audio"))
```

### Kotlin Speech to Text

```kotlin
val audioBytes = Files.readAllBytes(Paths.get("audio.wav"))
val response = client.postFileWithRequest(SpeechToText(),
    UploadFile("audio", "audio.wav", "audio/wav", audioBytes))
```

### Java Speech to Text

```java
byte[] audioBytes = Files.readAllBytes(Paths.get("audio.wav"));
var response = client.postFileWithRequest(new SpeechToText(),
    new UploadFile("audio", "audio.wav", "audio/wav", audioBytes));
```

### TypeScript Speech to Text

```js
// Create FormData and append the file
const formData = new FormData()
const audioFile = fs.readFileSync('audio.wav')
const blob = new Blob([audioFile], { type: 'audio/wav' })
// Explicitly set the field name as 'audio'
formData.append('audio', blob, 'audio.wav')

const api = await client.apiForm(new SpeechToText(), formData)
```

### Multiple File Uploads

All languages also support a `postFilesWithRequest` variant for uploading multiple files with an API Request. E.g. here's an example of using `PostFilesWithRequest` to generate a video with a Watermark:

### C# Watermark Video

```csharp
using var fsVideo = File.OpenRead("video.mp4");
using var fsWatermark = File.OpenRead("watermark.png");
var response = client.PostFilesWithRequest(
    new QueueWatermarkVideo {
        Position = WatermarkPosition.BottomRight
    }, [
        new UploadFile("video.mp4", fsVideo, "video"),
        new UploadFile("watermark.png", fsWatermark, "watermark")
    ]);
```

## Advantages over WCF

- **Simple** Server provides DTOs based on metadata and options provided. No heavy client-side tools, just a HTTP request!
- **Versatile** Clean DTOs work in all JSON, XML, JSV, MsgPack and ProtoBuf [generic service clients](/csharp-client#built-in-clients)
- **Reusable** Generated DTOs are not coupled to any endpoint or format. Defaults are both partial and virtual for maximum re-use
- **Resilient** Messaging-based services offer a number of [advantages over RPC Services](/advantages-of-message-based-web-services)
- **Flexible** DTO generation is customizable, Server and Clients can override built-in defaults
- **Integrated** Rich Service metadata annotated on DTOs, [Internal Services](/auth/restricting-services) are excluded when accessed externally

## In Contrast with WCF's Add Service Reference

WCF's **Add Service Reference** also allows generating a typed client from a single url, and whilst it's a great idea, the complexity it's built upon and the friction it imposes were the primary reasons we actively avoided using it (pre-ServiceStack).
We instead opted to reuse our server DTO types and created Generic WCF Proxies to provide a cleaner and simpler solution when consuming our own WCF services.

## ServiceStack's Native Types Feature

As with any ServiceStack feature, one of our primary goals is to [minimize unnecessary complexity](/autoquery#why-not-complexity) by opting for approaches that yield maximum value and minimal complexity, favoring re-use and simple, easy to reason about solutions over opaque heavy black-box tools.

We can already see from the WCF scenario how ServiceStack benefits from its message-based design where, since it's able to reuse any [Generic Service Client](/clients-overview), only application-specific DTOs ever need to be generated, resulting in a much cleaner, simpler and friction-less solution.

Code-first is another approach that lends itself to simpler solutions, saving the effort and inertia of adapting to interim schemas/specs, often with impedance mismatches and reduced/abstract functionality. In ServiceStack your code-first DTOs are the master authority that all other features are projected from.

C# also has great language support for defining POCO Data Models that's as terse as a DSL but benefits from great IDE support and minimal boilerplate, e.g:

```csharp
[Route("/path")]
public class Request : IReturn<Response>
{
    public int Id { get; set; }
    public string Name { get; set; }
    ...
}
```

Starting from a C# model, whilst naturally a better programmatic fit, also ends up being richer and more expressive than XSDs as it supports additional metadata annotations like Attributes and Interfaces.

### Remove Native Types Feature

Native Types is enabled by default in ServiceStack projects.
It can be disabled by removing the `NativeTypesFeature` plugin:

```csharp
Plugins.RemoveAll(x => x is NativeTypesFeature);
```

### Excluding Types from Add ServiceStack Reference

To remove a type from the metadata and code generation you can annotate Request DTOs with `[ExcludeMetadata]`, e.g:

```csharp
[ExcludeMetadata]
public class ExcludedFromMetadata
{
    public int Id { get; set; }
}
```

An alternative is to add it to the `IgnoreTypes` collection in the NativeTypes Feature Metadata Config in your AppHost:

```csharp
var nativeTypes = this.GetPlugin<NativeTypesFeature>();
nativeTypes.MetadataTypesConfig.IgnoreTypes.Add(typeof(TypeToIgnore));
```

If you only want to limit code generation based on where the reference is being added from, you can use the [Restrict Attribute](/auth/restricting-services). E.g. you can limit types to only appear when the reference is added from localhost:

```csharp
[Restrict(LocalhostOnly = true)]
public class RestrictedToLocalhost { }
```

Or when added from within an internal network:

```csharp
[Restrict(InternalOnly = true)]
public class RestrictedToInternalNetwork { }
```

There's also the rarer option when you only want a service accessible from external requests:

```csharp
[Restrict(ExternalOnly = true)]
public class RestrictedToExternalRequests { }
```

### Export Types

By default the `NativeTypesFeature` doesn't emit any **System** types built into the Base Class Libraries; these can be emitted for non-.NET Languages with the `ExportTypes` list, e.g.
if your DTOs expose the `DayOfWeek` System Enum it can be exported by adding it to the pre-registered NativeTypesFeature Plugin with:

```csharp
var nativeTypes = this.GetPlugin<NativeTypesFeature>();
nativeTypes.MetadataTypesConfig.ExportTypes.Add(typeof(DayOfWeek));
```

If any of your DTOs has a `DayOfWeek` property it will be emitted in the generated DTOs, e.g. in Java:

```java
public static enum DayOfWeek
{
    Sunday,
    Monday,
    Tuesday,
    Wednesday,
    Thursday,
    Friday,
    Saturday;
}
```

### Force Include Types in Native Types DTOs

ServiceStack's Add ServiceStack Reference feature carefully limits which DTOs it generates based on just the DTOs needed by different client packages to call APIs.

There are many reasons why Types aren't generated, e.g. they already exist in the service client library, the APIs have [Visibility or Access restrictions](/auth/restricting-services), they're built-in APIs purposefully hidden by ServiceStack to reduce bloat, etc.

We can override these rules for specific Types by including them in `Metadata.ForceInclude`, e.g:

```csharp
public override void Configure(Container container)
{
    Metadata.ForceInclude = new() {
        typeof(MetadataApp),
        typeof(AppMetadata),
        typeof(AdminQueryUsers),
        typeof(AdminGetUser),
        typeof(AdminCreateUser),
        typeof(AdminUpdateUser),
        typeof(AdminDeleteUser),
    };
}
```

### Enable Versioning

You can implement our [recommended Versioning strategy](http://stackoverflow.com/a/12413091/85785) and embed a version number in all generated Request DTOs by specifying an `AddImplicitVersion`, either globally on the Server in your AppHost:

```csharp
var nativeTypes = this.GetPlugin<NativeTypesFeature>();
nativeTypes.MetadataTypesConfig.AddImplicitVersion = 1;
```

Alternatively you can configure [AddImplicitVersion in client Options](/csharp-add-servicestack-reference#addimplicitversion).
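With `AddImplicitVersion = 1`, each generated Request DTO carries a `Version` property pre-populated with the configured number, along the lines of the sketch below, where the DTO and property names are illustrative:

```csharp
// Sketch of a generated Request DTO with an implicit version (names assumed)
public partial class GetTechnology
    : IReturn<GetTechnologyResponse>
{
    public virtual int Version { get; set; } = 1; // added by AddImplicitVersion
    public virtual string Slug { get; set; }
}
```

Because the `Version` is embedded in every Request DTO, the server can inspect it to handle older clients without a separate versioning mechanism.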
### Generating Types from Metadata

Behind the scenes ServiceStack captures all metadata on your Services' DTOs, including Sub-classes, Routes, the `IReturn` marker, C# Attributes and textual Descriptions, as well as desired configuration, into a serializable object model accessible from `/types/metadata`:

## How it works

The Add ServiceStack Reference dialog just takes the URL provided and requests the appropriate route for the current project. E.g. for C#, the path used is `/types/csharp`. The defaults are specified by the server and the resultant DTOs are saved and added to the project as `.dtos.`. The `Update ServiceStack Reference` menu is available when any file matches the same naming convention of `.dtos.`. An update then looks at the comments at the top of the file and parses them to provide overrides when requesting new DTOs from the server. ServiceStackVS also watches these DTO files for updates, so just by saving them these files are updated from the server.

### Language Paths

| Path | Language |
| -- | -- |
| /types/csharp | C# |
| /types/typescript | TypeScript |
| /types/typescript.d | Ambient TypeScript Definitions |
| /types/js | CommonJS |
| /types/python | Python |
| /types/swift | Swift |
| /types/java | Java |
| /types/kotlin | Kotlin |
| /types/dart | Dart |
| /types/fsharp | F# |
| /types/vbnet | VB .NET |
| /types/metadata | Metadata |

::include add-servicestack-reference-footer.md::

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/adhoc-utils.md

---
slug: adhoc-utils
title: Adhoc Utils
---

## Image Utils

The `Image.ResizeToPng()` and `Image.CropToPng()` [extension methods](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack/ImageExtensions.cs) can be used to resize and crop `System.Drawing` Images, e.g:

```csharp
[AddHeader(ContentType = "image/png")]
public Stream Get(Resize request)
{
    var imageFile = VirtualFiles.GetFile(request.Path);
    if (imageFile == null)
        throw
            HttpError.NotFound(request.Path + " was not found");
    using (var stream = imageFile.OpenRead())
    using (var img = Image.FromStream(stream))
    {
        return img.ResizeToPng(request.Width, request.Height);
    }
}

[AddHeader(ContentType = "image/png")]
public Stream Get(Crop request)
{
    var imageFile = VirtualFiles.GetFile(request.Path);
    if (imageFile == null)
        throw HttpError.NotFound(request.Path + " was not found");
    using (var stream = imageFile.OpenRead())
    using (var img = Image.FromStream(stream))
    {
        return img.CropToPng(request.Width, request.Height, request.StartX, request.StartY);
    }
}
```

## Enum Utils

The `EnumUtils.GetValues()`, `IEnumerable.ToKeyValuePairs()` and `Enum.ToDescription()` extension methods make it easy to create data sources from Enums that can be annotated with `[ApiMember]` and `[Description]` attributes:

```csharp
List<KeyValuePair<string, string>> Titles => EnumUtils.GetValues<Title>()
    .Where(x => x != Title.Unspecified)
    .ToKeyValuePairs();

List<string> FilmGenres => EnumUtils.GetValues<FilmGenres>()
    .Map(x => x.ToDescription());
```

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/auth/admin-apikeys.md

---
title: Simple Auth for .NET 10 Apps
---

With ServiceStack now fully [integrated with ASP.NET Identity Auth](/auth/identity-auth), our latest [.NET 10 Tailwind Templates](/start) offer a full-featured Auth Configuration complete with User Registration, Login, Password Recovery, Two Factor Auth, and more.

Whilst great for Web Applications that need it, it neglects the class of Apps which don't need User Auth and the additional complexity it brings, inc. Identity and Password Management, EF Migrations, Token Expirations, OAuth Integrations, etc.

For these stand-alone Apps, Microservices and Docker Appliances that would still like to restrict access to their APIs but don't need the complexity of ASP.NET Core's Authentication machinery, a simpler Auth Story would be preferred.
With the introduction of API Keys in this release we're able to provide a simpler Auth Story for .NET 10 Microservices that's easy for **Admin** Users to manage and control which trusted clients and B2B Integrations can access their functionality.

:::youtube 0ceU91ZBhTQ
Simple Auth Story with API Keys ideal for .NET 10 Microservices
:::

The easiest way to get started is by creating a new Empty project with API Keys enabled with your preferred database to store the API Keys in. SQLite is a good choice for stand-alone Apps as it doesn't require any infrastructure dependencies.

<div class="not-prose mx-auto">
    <h3 id="template" class="mb-4 text-4xl tracking-tight font-extrabold text-gray-900">
        Create a new Empty project with API Keys
    </h3>
    <auth-templates></auth-templates>
</div>

### Existing Projects

Existing projects not configured with Authentication can enable this simple Auth configuration by running:

:::sh
npx add-in apikeys-auth
:::

Which will add the [ServiceStack.Server](https://nuget.org/packages/ServiceStack.Server) dependency and the [Modular Startup](/modular-startup) configuration below:

```csharp
public class ConfigureApiKeys : IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureServices(services =>
        {
            services.AddPlugin(new AuthFeature([
                new ApiKeyCredentialsProvider(),
                new AuthSecretAuthProvider("p@55wOrd"),
            ]));
            services.AddPlugin(new SessionFeature());
            services.AddPlugin(new ApiKeysFeature
            {
                // Optional: Available Features Admin Users can assign to any API Key
                // Features = [
                //     "Paid",
                //     "Tracking",
                // ],
                // Optional: Available Scopes Admin Users can assign to any API Key
                // Scopes = [
                //     "todo:read",
                //     "todo:write",
                // ],
            });
        })
        .ConfigureAppHost(appHost =>
        {
            using var db = appHost.Resolve<IDbConnectionFactory>().Open();
            var feature = appHost.GetPlugin<ApiKeysFeature>();
            feature.InitSchema(db);
        });
}
```

Which configures the **AuthSecretAuthProvider** with the **Admin** password and **ApiKeysFeature** to enable [API Keys](/auth/apikeys) support.

### Admin UI

The **Admin** password will give you access to the [Admin UI](/admin-ui) at:

:::{.text-4xl .text-center .text-indigo-800}
/admin-ui
:::

![](/img/pages/auth/simple/admin-ui-signin.png)

![](/img/pages/auth/simple/admin-ui-dashboard.png)

### API Keys Admin UI

Clicking on the **API Keys** menu item will take you to the API Keys Admin UI where you'll be able to create new API Keys that you can distribute to the different API consumers you want to give access to your APIs:

![](/img/pages/auth/simple/admin-ui-apikeys.png)

The **ApiKeysFeature** plugin will let you control different parts of the UI, including what **Features** you want to assign to API Keys and what **Scopes** you want individual API Keys to be able to have access to.

```csharp
services.AddPlugin(new ApiKeysFeature
{
    Features = [
        "Paid",
        "Tracking",
    ],
    Scopes = [
        "todo:read",
        "todo:write",
    ],
    // ExpiresIn = [
    //     new("", "Never"),
    //     new("30", "30 days"),
    //     new("365", "365 days"),
    // ],
    // Hide = ["RestrictTo","Notes"],
});
```

Any configuration on the plugin will be reflected in the UI:

![](/img/pages/auth/simple/admin-ui-apikeys-new.png)

The API Keys Admin UI also lets you view and manage all API Keys in your App, including the ability to revoke API Keys, extend their Expiration date as well as manage any Scopes and Features assigned to API Keys.

![](/img/pages/auth/simple/admin-ui-apikeys-edit.png)

### Protect APIs with API Keys

You'll now be able to protect APIs by annotating Request DTOs with the `[ValidateApiKey]` attribute:

```csharp
[ValidateApiKey]
public class Hello : IGet, IReturn<HelloResponse>
{
    public required string Name { get; set; }
}
```

Which only allows requests with a **valid API Key** to access the Service.
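A client with a valid key can then call the protected API by sending it as a Bearer Token. A minimal sketch using the typed `JsonApiClient`, where `baseUrl` and `apiKey` are placeholders for your App's URL and a key issued in the API Keys Admin UI:

```csharp
// Sketch: call the [ValidateApiKey]-protected Hello API with an issued key.
// baseUrl and apiKey are placeholders for your host and an issued API Key.
var client = new JsonApiClient(baseUrl) {
    BearerToken = apiKey
};

var response = await client.GetAsync(new Hello { Name = "World" });
// Requests without a valid API Key are rejected before reaching the Service
```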
### Scopes

We can further restrict API access by assigning APIs a scope, which only allows access to valid API Keys configured with that scope, e.g:

```csharp
[ValidateApiKey("todo:read")]
public class QueryTodos : QueryDb<Todo>
{
    public long? Id { get; set; }
    public List<long>? Ids { get; set; }
    public string? TextContains { get; set; }
}

[ValidateApiKey("todo:write")]
public class CreateTodo : ICreateDb<Todo>, IReturn<Todo>
{
    [ValidateNotEmpty]
    public required string Text { get; set; }
    public bool IsFinished { get; set; }
}

[ValidateApiKey("todo:write")]
public class UpdateTodo : IUpdateDb<Todo>, IReturn<Todo>
{
    public long Id { get; set; }
    [ValidateNotEmpty]
    public required string Text { get; set; }
    public bool IsFinished { get; set; }
}

[ValidateApiKey("todo:write")]
public class DeleteTodos : IDeleteDb<Todo>, IReturnVoid
{
    public long? Id { get; set; }
    public List<long>? Ids { get; set; }
}
```

### Restrict To APIs

Scopes allow for coarse-grained access control where a single scope can access a logical group of APIs. For more fine-grained control you can use **Restrict To APIs** to specify just the APIs an API Key can access:

![](/img/pages/auth/simple/admin-ui-apikeys-restrict-to.png)

Unlike scopes which can access APIs with the **same scope** or **without a scope**, valid API Keys configured with **Restrict To APIs** can only access those specific APIs.

### Features

Features are user-defined strings accessible within your Service implementation to provide different behavior based on the Features assigned to the API Key, e.g:

```csharp
public object Any(QueryTodos request)
{
    if (Request.GetApiKey().HasFeature("Paid"))
    {
        //...
    }
}
```

### Admin Only APIs

For APIs that should only be accessible to Admin Users (using AuthSecret) use `[ValidateIsAdmin]`, e.g:

```csharp
[ValidateIsAdmin]
public class AdminResetTodos : IPost, IReturnVoid {}
```

### API Explorer

Support for API Keys is also integrated into the [API Explorer](/api-explorer) allowing users to use their API Keys to access API Key protected Services which are highlighted with a **Key** Icon:

![](/img/pages/auth/simple/apiexplorer-requires-apikey.png)

Users can enter their API Key by clicking on the **Key** Icon in the top right, or the link in the Warning alert when trying to access an API Key protected Service:

![](/img/pages/auth/simple/apiexplorer-apikey-dialog.png)

## API Keys and Admin Secret Credentials Auth Provider

The usability of Simple Admin API Keys is greatly improved with the `ApiKeyCredentialsProvider`, which enables .NET Microservices to provide persistent, UserSession-like behavior for API Keys and Admin Auth Secrets via a Credentials Auth implementation that users can Sign In to with their API Key or Admin AuthSecret.

When registered, a **Credentials** SignIn dialog will appear for [ServiceStack Built-in UIs](https://servicestack.net/auto-ui) allowing users to Sign In with their **API Keys** or Admin **Auth Secret**.

![](/img/pages/auth/simple/ai-server-auth-apiexplorer.png)

### Session Auth with API Keys

Behind the scenes this creates a Server [Auth Session](/auth/sessions) but instead of maintaining an Authenticated User Session it saves the API Key in the session and attaches the API Key to each request. This makes it possible to make API Key validated requests with just a session cookie instead of requiring resubmission of API Keys for each request.
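From a client's perspective this can be sketched as follows, assuming the standard `Authenticate` DTO where the API Key is submitted as the credentials password (`baseUrl`, `displayName` and `apiKey` are illustrative placeholders; the exact fields your configuration expects may differ):

```csharp
// Hypothetical sketch: Sign In once with an API Key via the Credentials
// Auth provider; the server saves the key in a Server Session
var client = new JsonApiClient(baseUrl);
await client.PostAsync(new Authenticate {
    provider = "credentials",
    UserName = displayName, // Name shown in the Built-in UIs
    Password = apiKey       // API Key validated & stored in the session
});

// Subsequent calls are authenticated by the session cookie alone,
// without resubmitting the API Key
var response = await client.GetAsync(new Hello { Name = "World" });
```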
### Secure .NET Microservices and Docker Appliances

This is an ideal Auth Configuration for .NET Docker Appliances and Microservices like [AI Server](/ai-server/) that don't need the complexity of ASP .NET Core's Identity Auth machinery and just want to restrict access to their APIs with API Keys and restrict Admin functionality to Administrators with an Auth Secret.

The benefit of `ApiKeyCredentialsProvider` is that it maintains a persistent Session, so end users only need to enter their API Key a single time and they'll be able to navigate to all of AI Server's protected pages using the API Key maintained in their Server User Session, without needing to re-enter it for each UI and every request.

### User Access with API Keys

AI Server uses **API Keys** to restrict Access to its AI Features to **authorized Users** with valid API Keys, who are able to use its Built-in UIs for its AI Features with the User's preferred Name and issued API Key:

![](/img/pages/auth/simple/ai-server-auth-user.png)

After signing in a single time they'll be able to navigate to any protected page and start using AI Server's AI features:

![](/img/pages/auth/simple/ai-server-auth-user-chat.png)

### User Access to API Explorer

This also lets users use their existing Auth Session across completely different UIs like [API Explorer](/api-explorer) where they'll have the same access to APIs as they would when calling APIs programmatically with their API Keys, e.g:

![](/img/pages/auth/simple/ai-server-auth-apiexplorer-api.png)

## Admin Access

AI Server also maintains an Admin UI and Admin APIs that are only accessible to **Admin** users who Authenticate with the App's configured Admin Auth Secret, who can then access AI Server's Admin UIs to monitor Live AI Requests, create new User API Keys, manage registered AI Providers, etc.
![](/img/pages/auth/simple/ai-server-auth-admin-jobs.png)

### Admin Restricted APIs

You can restrict APIs to Admin Users by using `[ValidateAuthSecret]`:

```csharp
[Tag(Tags.Admin)]
[ValidateAuthSecret]
[Api("Add an AI Provider to process AI Requests")]
public class CreateAiProvider : ICreateDb<AiProvider>, IReturn<IdResponse>
{
    //...
}
```

Which are identified in API Explorer with a **padlock** icon whilst APIs restricted by API Key are identified with a **key** icon:

![](/img/pages/auth/simple/ai-server-auth-apiexplorer-admin.png)

### Client Usage

All HTTP and existing [Service Clients](https://docs.servicestack.net/clients-overview) can be configured to use API Keys for machine-to-machine communication, which like most API Key implementations can be passed in an [HTTP Authorization Bearer Token](https://datatracker.ietf.org/doc/html/rfc6750#section-2.1) that can be configured in Service Clients with:

#### C#

```csharp
var client = new JsonApiClient(BaseUrl) {
    BearerToken = apiKey
};
```

#### TypeScript

```ts
const client = new JsonServiceClient(BaseUrl)
client.bearerToken = apiKey
```

### API Key HTTP Header

Alternatively, API Keys can also be passed in the `X-Api-Key` HTTP Header, which allows clients to be configured with an alternative Bearer Token allowing the same client to call both **Authenticated** and **API Key** protected APIs, e.g:

#### C#

```csharp
var client = new JsonApiClient(BaseUrl) {
    BearerToken = AuthSecret,
    Headers = {
        [HttpHeaders.XApiKey] = apiKey
    }
};
```

#### TypeScript

```ts
const client = new JsonServiceClient(BaseUrl)
client.bearerToken = AuthSecret
client.headers.set('X-Api-Key', apiKey)
```

## Development

You can avoid having to re-enter the AuthSecret and API Keys during Development by populating every request with the configured Admin AuthSecret, which allows you to call both `[ValidateApiKey]` and `[ValidateIsAdmin]` protected APIs:

```csharp
#if DEBUG
PreRequestFilters.Add((req, res) =>
{
    req.Items[Keywords.AuthSecret] = authSecret;
    req.Items[Keywords.Authorization] = "Bearer " + authSecret;
});
#endif
```

### Summary

We hope this shows how stand-alone .NET 10 Microservices and self-contained Docker Apps can use the simple **Admin** and **API Keys** configuration to easily secure their APIs, complete with **Management UI** and **typed Service Client** integrations.

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/admin-ui-analytics.md

---
title: Admin UI Analytics for SQLite
---

Comprehensive API Analytics is available to all ServiceStack Apps configured with [SQLite Request Logging](/sqlite-request-logs).

:::youtube kjLcm1llC5Y
In Depth and Interactive API Analytics available to all ASP .NET Core ServiceStack Apps!
:::

### Benefits of API Analytics

They provide deep and invaluable insight into your System's API Usage, device distribution, its Users, API Keys and the IPs where most traffic generates:

- **Visibility:** Provides a clear, visual summary of complex log data, making it easier to understand API usage and performance at a glance.
- **Performance Monitoring:** Helps track key metrics like request volume and response times to ensure APIs are meeting performance expectations.
- **User Understanding:** Offers insights into how users (and bots) are interacting with the APIs (devices, browsers).
- **Troubleshooting:** Aids in quickly identifying trends, anomalies, or specific endpoints related to issues.
- **Resource Planning:** Understanding usage patterns helps in scaling infrastructure appropriately.
- **Security Insight:** Identifying bot traffic and unusual request patterns can be an early indicator of security concerns.
- **Interactive Analytics:** Analytics are also interactive where you're able to drill down to monitor the activity of individual APIs, Users, API Keys and IPs with links back to the request logs which the summary analytics are derived from.
### Getting Started

As they offer significant and valuable insights they're now built into all new ASP.NET Core IdentityAuth templates, whilst existing .NET 10 IdentityAuth templates can enable it with:

:::sh
npx add-in sqlitelogs
:::

.NET 10 Templates that are not configured to use [Endpoint Routing](/endpoint-routing) and [ASP.NET Core IOC](/net-ioc) will need to explicitly register `SqliteRequestLogger` as a singleton dependency in addition to configuring it on the `RequestLogsFeature` plugin:

```csharp
public class ConfigureRequestLogs : IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureServices((context, services) =>
        {
            var logger = new SqliteRequestLogger();
            services.AddSingleton<IRequestLogger>(logger);

            services.AddPlugin(new RequestLogsFeature
            {
                RequestLogger = logger,
                EnableRequestBodyTracking = true,
                EnableErrorTracking = true
            });
            services.AddHostedService<RequestLogsHostedService>();

            if (context.HostingEnvironment.IsDevelopment())
            {
                services.AddPlugin(new ProfilingFeature());
            }
        });
}
```

## Analytics Admin UI

Once configured, [SQLite Request Logs](/sqlite-request-logs) enable a more feature-rich Request Logging Admin UI which utilizes the full queryability of an AutoQueryGrid to filter, sort and export Request Logs.

[![](/img/pages/admin-ui/sqlitelogs.webp)](/img/pages/admin-ui/sqlitelogs.webp)

### Rolling Monthly Request Logs

Benefits of using SQLite include removing load from your App's primary database and being able to create naturally scalable and isolated Monthly databases on-the-fly, which allow requests to be easily archived into managed file storage instead of a singular growing database.
## Analytics Overview

It also enables the new **Analytics** Admin UI in the sidebar which initially displays the API Analytics overview Dashboard:

[![](/img/pages/admin-ui/analytics-apis1.webp)](/img/pages/admin-ui/analytics-apis1.webp)

Different charts displayed on the dashboard include:

### Distribution Pie Charts

Lets you quickly understand the composition of your user base and traffic sources: the distribution of users across different web browsers and device types, and the proportion of traffic coming from automated bots.

### Requests per day Line Chart

Lets you monitor API usage trends and performance over time. It tracks the total number of API requests and the average response time day-by-day. You can easily spot trends like peak usage hours/days, identify sudden spikes or drops in traffic, and correlate request volume with API performance, which is crucial for capacity planning and performance troubleshooting.

### API tag groups Pie Chart

Lets you understand the usage patterns across different functional categories of your APIs. By grouping API requests based on assigned tags (like Security, Authentication, User Management, Tech, etc.), you get a high-level view of which *types* of functionalities are most frequently used or are generating the most load.

### API Requests Bar Chart

Lets you identify the most and least frequently used API endpoints by ranking individual API endpoints by the number of requests they receive. This helps pinpoint:

- **Critical Endpoints:** The most heavily used APIs that require robust performance and monitoring.
- **Optimization Targets:** High-traffic endpoints that could benefit from performance optimization.
- **Underutilized Endpoints:** APIs that might be candidates for deprecation or require promotion.
- **Troubleshooting:** If performance issues arise (seen in the line chart), this helps narrow down which specific endpoint might be responsible.
[![](/img/pages/admin-ui/analytics-apis2.webp)](/img/pages/admin-ui/analytics-apis2.webp)

### Total Duration Bar Chart

Identifies which API endpoints consume the most *cumulative processing time* over the selected period. Even if an API endpoint is relatively fast per call, if it's called extremely frequently it can contribute significantly to overall server load. Optimizing these can lead to significant savings in server resources (CPU, memory).

### Average Duration Bar Chart

Pinpoints which API endpoints are the slowest on a *per-request* basis. APIs at the top of this list are prime candidates for performance investigation and optimization, as they represent potential user-facing slowness or system bottlenecks.

### Requests by Duration Ranges Histogram

Provides an overview of the performance distribution for *all* API requests. This chart shows how many requests fall into different speed buckets and helps you understand the overall responsiveness of your API system at a glance.

## Individual API Analytics

Clicking on an API's bar chart displays a dedicated, detailed view of a single API endpoint's behavior, isolating its performance and usage patterns from the overall system metrics and offering immediate insight into the endpoint's traffic volume and reliability.

[![](/img/pages/admin-ui/analytics-api.webp)](/img/pages/admin-ui/analytics-api.webp)

### Total Requests

Displays the total requests for an API during the selected month. It includes an HTTP Status Breakdown which provides **direct access to the filtered request logs**. This is a major benefit for **rapid troubleshooting**, allowing you to instantly view the specific log entries corresponding to successful requests or particular error codes for this API.
### Last Request Information

Provides immediate context on the most recent activity for this endpoint: *when* the last request occurred, its source **IP address** and device information, to help understand recent usage, check if the endpoint is still active, or quickly investigate the very last interaction if needed.

### Duration Summary Table (Total, Min, Max)

Quantifies the performance characteristics specifically for this endpoint: the cumulative (Total) processing load, the best-case performance (Min), and the worst-case performance (Max), which is useful for identifying performance outliers.

### Duration Requests Histogram

Visualizes the performance distribution for this API.

### Top Users Bar Chart

Identifies which authenticated users call this API most frequently and rely on this endpoint the most. This can be useful for identifying power users, potential API abuse by a specific user account, or understanding the impact of changes to this API on key users.

### Top IP Addresses Bar Chart

Shows which source IP addresses are generating the most traffic for this API. Useful for identifying high-volume clients, specific servers interacting with this endpoint, or potentially malicious IPs.

## Users

The **Users** tab will display the top 100 Users who make the most API Requests and lets you click on a User's bar chart to view their individual User analytics.

[![](/img/pages/admin-ui/analytics-users.webp)](/img/pages/admin-ui/analytics-users.webp)

### Individual User Analytics

Provides a comprehensive view of a single user's complete interaction history and behavior across all APIs they've accessed, shifting the focus from API performance to user experience and activity.

[![](/img/pages/admin-ui/analytics-user.webp)](/img/pages/admin-ui/analytics-user.webp)

### User Info & Total Requests

Identifies the user and quantifies their overall activity level. Clicking on their ID or Name will navigate to the Users Admin UI.
It also shows their success/error rate via the clickable status code links. This helps gauge user engagement and baseline activity.

### Last Request Information

Offers a snapshot of the user's most recent interaction for immediate context. Knowing **when**, **what** API they called, from which **IP address**, using which **client** & **device** is valuable for support, identifying their last action or checking recent activity.

### HTTP Status Pie Chart

Visualizes the overall success and error rate specifically for this user's API requests.

### Performance & Request Body Summary Table

Quantifies the performance experienced by this user and the data they typically send.

### Duration Requests Histogram

Shows the distribution of response times for requests made by this user, to help understand the typical performance this user experiences.

### Top APIs Bar Chart

Reveals which API endpoints this user interacts with most frequently, helping understand user behavior and which features they use most.

### Top IP Addresses Bar Chart

Identifies the primary network locations or devices the user connects from.

### User Admin UI Analytics

To assist in discoverability, a snapshot of a User's Analytics is also visible in the Users Admin UI:

[![](/img/pages/admin-ui/analytics-user-adminui.webp)](/img/pages/admin-ui/analytics-user-adminui.webp)

Clicking on **View User Analytics** takes you to the User's Analytics page with access to the full Analytics features and navigation.

## API Keys

The **API Keys** tab will display the top 100 API Keys that make the most API Requests and lets you click on an API Key's bar chart to view its individual API Key analytics.
[![](/img/pages/admin-ui/analytics-apikeys.webp)](/img/pages/admin-ui/analytics-apikeys.webp)

### Individual API Key Analytics

Provides comprehensive API Key analytics, similar to User Analytics but limited to the API Usage of a single API Key:

[![](/img/pages/admin-ui/analytics-apikey.webp)](/img/pages/admin-ui/analytics-apikey.webp)

## IPs

The **IP Addresses** tab will display the top 100 IPs that make the most API Requests. Click on an IP's bar chart to view the individual analytics of requests made from that IP Address.

[![](/img/pages/admin-ui/analytics-ips.webp)](/img/pages/admin-ui/analytics-ips.webp)

### Individual IP Analytics

Provides comprehensive IP Address analytics, similar to User Analytics but limited to the API Usage from a single IP Address:

[![](/img/pages/admin-ui/analytics-ip.webp)](/img/pages/admin-ui/analytics-ip.webp)

## Blocking User Agents

The insights from the Analytics showed us that our [pvq.app](https://pvq.app) was experiencing significant load from AI bots and scrapers, which was the primary cause of its high resource usage and detrimental load times for normal user requests, so much so that we've intervened to prevent these bots from scraping our site.
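At its core, this kind of bot blocking comes down to matching each request's `User-Agent` header against a deny list. A hypothetical simplification of the check such middleware performs (names are illustrative):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical sketch: case-insensitive substring match of a request's
// User-Agent against a list of blocked bot tokens
static bool IsBlockedBot(string? userAgent, IEnumerable<string> blockedBots) =>
    userAgent != null &&
    blockedBots.Any(bot => userAgent.Contains(bot, StringComparison.OrdinalIgnoreCase));

var blockedBots = new[] { "bytespider", "gptbot", "claudebot" };

Console.WriteLine(IsBlockedBot("Mozilla/5.0 (compatible; GPTBot/1.0)", blockedBots)); // True
Console.WriteLine(IsBlockedBot("Mozilla/5.0 (Windows NT 10.0; Win64; x64)", blockedBots)); // False
```

Matching requests can then be short-circuited with an error response before they reach your Services, which is what the configurable `useragent-blocking` add-in described next provides.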
### Disallowing Bots in robots.txt

In an ideal world you would just need to instruct problematic bots not to scrape your site by adding them to [pvq.app/robots.txt](https://pvq.app/robots.txt), e.g:

```txt
User-agent: Googlebot
Allow: /

User-agent: Bingbot
Allow: /

User-agent: bytespider
Disallow: /

User-agent: gptbot
Disallow: /

User-agent: claudebot
Disallow: /

User-agent: amazonbot
Disallow: /

User-agent: mj12bot
Disallow: /

User-agent: semrushbot
Disallow: /

User-agent: dotbot
Disallow: /

User-agent: WhatsApp Bot
Disallow: /

User-agent: *
Disallow: /
```

### Disallowing Bot Requests

As this was not having an immediate effect we took a more forceful approach and implemented middleware to reject all requests from disallowed bots, which you can add to your own App with:

:::sh
npx add-in useragent-blocking
:::

This will allow you to configure which Bot User Agents you want to reject from accessing your site, e.g:

```csharp
services.Configure<UserAgentBlockingOptions>(options =>
{
    // Add user agents to block
    options.BlockedUserAgents.AddRange([
        "bytespider",
        "gptbot",
        "claudebot",
        "amazonbot",
        "imagesiftbot",
        "semrushbot",
        "dotbot",
        "dataforseobot",
        "WhatsApp Bot",
        "HeadlessChrome",
        "PetalBot",
    ]);
    // Optional: Customize the response status code
    // options.BlockedStatusCode = StatusCodes.Status429TooManyRequests;

    // Optional: Customize the blocked message
    options.BlockedMessage = "This bot is not allowed to access our website";
});
```

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/admin-ui-database.md

---
title: Database Admin
---

The Database Admin UI lets you quickly browse and navigate your App's configured RDBMS schemas and tables:

<lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="NZkeyuc_prg" style="background-image: url('https://img.youtube.com/vi/NZkeyuc_prg/maxresdefault.jpg')"></lite-youtube>

It can be enabled by registering the `AdminDatabaseFeature` plugin from [ServiceStack.Server](https://nuget.org/packages/ServiceStack.Server):

```csharp
services.AddPlugin(new AdminDatabaseFeature());
```

Without any additional configuration, your App's configured databases will be listed on the home page, including their schemas, tables and any registered [named connections](/ormlite/getting-started#multiple-database-connections):

![](/img/pages/admin-ui/admin-ui-database.png)

Selecting a table takes us to a familiar tabular search results grid, similar in appearance and functionality to [Locode's Auto UI](/locode/):

![](/img/pages/admin-ui/admin-ui-database-table.png)

Whilst Locode gives you an entire Auto Management UI with all modifications performed through managed [AutoQuery APIs](/autoquery/), Database Admin instead focuses on providing a great readonly UX for querying & inspecting your App's data, starting with multiple views for quickly previewing every row, in either **Pretty** JSON format:

<div class="block flex justify-center items-center">
    <img class="max-w-screen-md" src="/img/pages/admin-ui/admin-ui-database-table-pretty.png">
</div>

Where it will also let you copy every row in JSON format, whilst the **Preview** tab shows a friendlier view of the row's fields:

<div class="block flex justify-center items-center">
    <img class="max-w-screen-md" src="/img/pages/admin-ui/admin-ui-database-table-preview.png">
</div>

The tabular grid is highly personalizable, letting you change the query preferences and display fields for each table, which are persisted in localStorage and preserved across browser restarts:

<div class="block flex justify-center items-center">
    <img class="max-w-screen-md" src="/img/pages/admin-ui/admin-ui-database-prefs.png">
</div>

Likewise so are the flexible filtering options allowing any number of filters per column:

<div class="block flex justify-center items-center">
    <img class="max-w-screen-md" src="/img/pages/admin-ui/admin-ui-database-filter.png">
</div>
The number and type of filters are readily available from the **Filters** dropdown showing all filters grouped under their column name, where they're easily cleared per filter, per column, or using **Clear All** to clear all filters:

![](/img/pages/admin-ui/admin-ui-database-filters.png)

After you've finished customizing your table search view, you can export the data with the **Excel** button to download the results in [CSV Format](/csv-format) where it can be opened in your favorite spreadsheet, e.g:

![](/img/pages/admin-ui/admin-ui-database-excel.png)

Alternatively the **Copy URL** button can be used to generate the API data URL to return results in JSON:

<div class="block flex justify-center items-center">
    <img class="max-w-screen-md" src="/img/pages/admin-ui/admin-ui-database-api-url.png">
</div>

## Database Admin Customizations

Some customization is available on the `AdminDatabaseFeature` plugin where you can control the maximum size of resultsets returned, use the `DatabasesFilter` to control which databases and schemas are displayed, and change the labels shown by setting their `Alias` properties, e.g:

```csharp
Plugins.Add(new AdminDatabaseFeature
{
    QueryLimit = 100,
    DatabasesFilter = dbs =>
    {
        foreach (var db in dbs)
        {
            if (db.Name == "main")
            {
                db.Alias = "Northwind";
                db.Schemas[0].Alias = "Traders";
            }
            else if (db.Name == "chinook")
            {
                db.Alias = "Chinook";
                db.Schemas[0].Alias = "Music";
            }
        }
    },
});
```

## Feedback Welcome

We hope you'll find the Database Admin feature useful, please let us know what other features you would like in [ServiceStack/Discuss](https://github.com/ServiceStack/Discuss/discussions).
---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/admin-ui-features.md

---
title: Admin UI Features
---

Built into ServiceStack v6+ Apps is the [Admin UI](/admin-ui) providing **Admin** Users a UX Friendly UI to access App features & summary insights from:

<div class="not-prose">
    <h3 class="text-center font-medium text-4xl text-indigo-800 m-0 py-3">/admin-ui</h3>
</div>

Which after authenticating will take you to the Admin UI dashboard showing the authenticated Admin User details and general API stats:

<div class="block p-4 rounded shadow">
    <img src="/img/pages/admin-ui/dashboard.png">
</div>

Further Admin UI functionality can be enabled by adding the dependencies and Admin APIs necessary to implement the Admin UI Features.

### Disabling the Admin UI

If desired, the **/admin-ui** features can be selectively or entirely disabled using the `AdminUi` Enum flags:

```csharp
ConfigurePlugin<UiFeature>(feature => feature.AdminUi = AdminUi.None);
```

## Admin Users

User management functionality for creating & modifying users, assigning Roles & Permissions, locking users or updating their passwords can be enabled by registering the `AdminUsersFeature` plugin:

```csharp
Plugins.Add(new AdminUsersFeature());
```

Which enables a familiar UI for searching & managing users:

<div class="block p-4 rounded shadow">
    <a href="/admin-ui-users"><img src="/img/pages/admin-ui/users.png"></a>
</div>

::: info
See [Admin UI User Docs](/admin-ui-users) to learn about Admin User features and available customization options
:::

## Redis Admin

The [Redis Admin UI](/admin-ui-redis) lets you manage your App's configured Redis Server with a user-friendly UX for managing core Redis data types, simple search functionality to quickly find Redis values, quick navigation between related values, first class support for JSON values and a flexible command interface and command history to inspect all previously run redis commands.
<lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="AACZtTOcQbg" style="background-image: url('https://img.youtube.com/vi/AACZtTOcQbg/maxresdefault.jpg')"></lite-youtube>

It can be enabled by registering the `AdminRedisFeature` plugin:

```csharp
services.AddPlugin(new AdminRedisFeature());
```

Which will enable the **Redis** Admin UI:

[![](/img/pages/admin-ui/admin-ui-redis.png)](/admin-ui-redis)

::: info
See [Redis Admin docs](/admin-ui-redis) for more info.
:::

## Database Admin

The [Database Admin UI](/admin-ui-database) lets you quickly browse and navigate your App's configured RDBMS schemas and tables:

<lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="NZkeyuc_prg" style="background-image: url('https://img.youtube.com/vi/NZkeyuc_prg/maxresdefault.jpg')"></lite-youtube>

It can be enabled by registering the `AdminDatabaseFeature` plugin from [ServiceStack.Server](https://nuget.org/packages/ServiceStack.Server):

```csharp
services.AddPlugin(new AdminDatabaseFeature());
```

Which will enable the **Database** Admin UI:

[![](/img/pages/admin-ui/admin-ui-database.png)](/admin-ui-database)

::: info
See [Database Admin docs](/admin-ui-database) for more info.
:::

## Request Logging & Profiling

Enables invaluable observability into your App, from being able to quickly inspect and browse incoming requests, to tracing their behavior:

:::sh
npx add-in profiling
:::

Which will add the [Modular Startup](/modular-startup) configuration to your Host project that registers both Request Logging & Profiling features when running your App in [DebugMode](/debugging#debugmode) (i.e. Development):

```csharp
[assembly: HostingStartup(typeof(MyApp.ConfigureProfiling))]

namespace MyApp;

public class ConfigureProfiling : IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureServices((context, services) =>
        {
            if (context.HostingEnvironment.IsDevelopment())
            {
                services.AddPlugin(new ProfilingFeature
                {
                    IncludeStackTrace = true,
                });
            }
        });
}
```

Which will enable the Request Logging & Profiling UIs:

<div class="block p-4 rounded shadow">
    <a href="/admin-ui-profiling"><img src="/img/pages/admin-ui/admin-ui-logging.png"></a>
</div>

<lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="LgQHTSHSk1g" style="background-image: url('https://img.youtube.com/vi/LgQHTSHSk1g/maxresdefault.jpg')"></lite-youtube>

::: info
See [Admin Logging & Profiling UI docs](/admin-ui-profiling) to learn about the Admin Profiling feature and available customization options.
:::

## Validation

The Admin Validation feature enables adding dynamically sourced validation rules that can be applied & modified at runtime.
The most popular `IValidationSource` for dynamic validation rules is `OrmLiteValidationSource`, which maintains them in the App's registered database in the `ValidationRule` RDBMS table:

```csharp
[assembly: HostingStartup(typeof(MyApp.ConfigureValidation))]

namespace MyApp;

public class ConfigureValidation : IHostingStartup
{
    // Add support for dynamically generated db rules
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureServices(services => services.AddSingleton<IValidationSource>(c =>
            new OrmLiteValidationSource(c.Resolve<IDbConnectionFactory>(), HostContext.LocalCache)))
        .ConfigureAppHost(appHost => {
            // Create `ValidationRule` table if it doesn't exist in AppHost.Configure() or Modular Startup
            appHost.Resolve<IValidationSource>().InitSchema();
        });
}
```

Which can be quickly added to your project with:

:::sh
npx add-in validation-source
:::

The built-in [Validation Feature](/validation.html#validation-feature) detects the registered `IValidationSource` and registers the `GetValidationRules` and `ModifyValidationRules` APIs used by the Admin Validation Feature:

<div class="block p-4 rounded shadow"> <a href="/admin-ui-validation"><img src="/img/pages/admin-ui/validation.png"></a> </div>

<lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="W5OJAlOxH98" style="background-image: url('https://img.youtube.com/vi/W5OJAlOxH98/maxresdefault.jpg')"></lite-youtube>

::: info
See [Admin UI Validation Docs](/admin-ui-validation) to learn about dynamic DB Validation Rules
:::

## Recommend Admin UI Features

The Admin UI was designed with room to grow. You can let us know what features you would find most valuable on our [GitHub Discussions](https://github.com/ServiceStack/Discuss/discussions/2).
---
# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/admin-ui-identity-roles.md
---
title: Identity Roles & Claims UI
---

The Roles Admin UI is enabled when registering the [Admin Users UI](/admin-ui-identity-users#registration), which enables management APIs and Admin UIs for managing Identity Auth Roles and Claims for both Users and Roles. Once registered it will be available from the **Roles** menu item in the Admin UI sidebar, which can be used to Add and Remove Application Roles:

![](/img/pages/admin-ui/identityauth-roles.webp)

### Custom Application Roles

If your App uses an extended `IdentityRole` data model, it can be configured with:

```csharp
services.AddPlugin(
    new AuthFeature(IdentityAuth.For<ApplicationUser,ApplicationRole>(...)));
```

If it's also configured to use a different `PrimaryKey` type, it can be configured with:

```csharp
services.AddPlugin(
    new AuthFeature(IdentityAuth.For<AppUser,AppRole,int>(...)));
```

### IdentityAuth Role Claims

The Edit Role Admin UI also supports Adding and Removing Claims for a Role, e.g:

![](/img/pages/admin-ui/identityauth-role-claims.webp)

Any Added or Removed Claims are only applied after clicking **Update Role**; likewise, you can exit the UI without applying any changes by clicking **Cancel**.

### Behavior of Role Claims

Claims added to Roles behave like Claims individually applied to all Users with that Role, such that when a User is Authenticated they're populated with all claims assigned to their Roles as well as their individual User Claims.
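Role Claims managed through the Edit Role Admin UI behave like standard ASP.NET Core Identity role claims, so they can also be assigned programmatically with Identity's `RoleManager`. A minimal sketch, where the **Manager** role and the `"perm"`/`"todo:write"` claim are hypothetical examples:

```csharp
using System.Security.Claims;
using Microsoft.AspNetCore.Identity;

// Assign a Claim to a Role programmatically - equivalent to adding it
// in the Edit Role Admin UI. The claim type & value here are illustrative.
public static async Task AddRoleClaimAsync(RoleManager<ApplicationRole> roleManager)
{
    var role = await roleManager.FindByNameAsync("Manager");
    if (role != null)
        await roleManager.AddClaimAsync(role, new Claim("perm", "todo:write"));
}
```

Users in the **Manager** role will then be populated with this claim the next time they authenticate.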
## Validating Claims

::include admin-ui-claims-validation.md::

---
# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/admin-ui-identity-users.md
---
title: Identity User Admin Feature
---

::: info
When using **ServiceStack Auth** refer to [Admin Users UI](/admin-ui-users) instead
:::

## Registration

The Identity Auth Admin UI can be enabled when registering the `AuthFeature` Plugin by calling `AdminUsersFeature()`:

```csharp
public class ConfigureAuth : IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureServices(services => {
            services.AddPlugin(new AuthFeature(IdentityAuth.For<ApplicationUser>(
                options => {
                    // options.SessionFactory = () => new CustomUserSession(); //optional
                    options.CredentialsAuth();
                    options.AdminUsersFeature();
                })));
        });
}
```

Which, just like the ServiceStack Auth [Admin Users UI](/admin-ui-users), enables an Admin UI that's only accessible to **Admin** Users for managing **Identity Auth** users at `/admin-ui/users`.
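The registration above references a custom `ApplicationUser` model, and the examples that follow display additional properties on it. A minimal sketch of such a model, with property names assumed from those examples:

```csharp
using Microsoft.AspNetCore.Identity;

// Extends ASP.NET Core Identity's IdentityUser with the custom properties
// referenced in the Admin UI examples (an illustrative sketch)
public class ApplicationUser : IdentityUser
{
    public string? FirstName { get; set; }
    public string? LastName { get; set; }
    public string? DisplayName { get; set; }
}
```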
## User Search Results

Which displays a limited view of a User's info due to the minimal properties on the default `IdentityAuth` model:

<div> <img class="shadow" src="/img/pages/auth/identity/admin-ui-users-default.png"> </div>

### Custom Search Result Properties

These User Search results are customizable by specifying the `ApplicationUser` properties you want displayed instead:

```csharp
options.AdminUsersFeature(feature =>
{
    feature.QueryIdentityUserProperties =
    [
        nameof(ApplicationUser.Id),
        nameof(ApplicationUser.DisplayName),
        nameof(ApplicationUser.Email),
        nameof(ApplicationUser.UserName),
        nameof(ApplicationUser.LockoutEnd),
    ];
});
```

<div> <img class="shadow" src="/img/pages/auth/identity/admin-ui-users-custom.png"> </div>

### Custom Search Result Behavior

The default display Order of Users is also customizable:

```csharp
feature.DefaultOrderBy = nameof(ApplicationUser.DisplayName);
```

As well as the Search behavior, which can be replaced to search any custom fields, e.g:

```csharp
feature.SearchUsersFilter = (q, query) =>
{
    var queryUpper = query.ToUpper();
    return q.Where(x =>
        x.DisplayName!.Contains(query) ||
        x.Id.Contains(queryUpper) ||
        x.NormalizedUserName!.Contains(queryUpper) ||
        x.NormalizedEmail!.Contains(queryUpper));
};
```

## Default Create and Edit Users Forms

The default Create and Edit Admin Users UIs are also limited to editing the minimal `IdentityAuth` properties:

<div> <img class="shadow" src="/img/pages/auth/identity/admin-ui-users-create.png"> </div>

Whilst the Edit page includes standard features to lock out users, change user passwords and manage their roles:

<div> <img class="shadow" src="/img/pages/auth/identity/admin-ui-users-edit.png"> </div>

### Custom Create and Edit Forms

By default Users are locked out indefinitely, but this can also be changed to lock users out to a specific date, e.g:

```csharp
feature.ResolveLockoutDate = user => DateTimeOffset.Now.AddDays(7);
```

The forms' editable fields can also be customized to include
additional properties, e.g:

```csharp
feature.FormLayout =
[
    Input.For<ApplicationUser>(x => x.UserName, c => c.FieldsPerRow(2)),
    Input.For<ApplicationUser>(x => x.Email, c => {
        c.Type = Input.Types.Email;
        c.FieldsPerRow(2);
    }),
    Input.For<ApplicationUser>(x => x.FirstName, c => c.FieldsPerRow(2)),
    Input.For<ApplicationUser>(x => x.LastName, c => c.FieldsPerRow(2)),
    Input.For<ApplicationUser>(x => x.DisplayName, c => c.FieldsPerRow(2)),
    Input.For<ApplicationUser>(x => x.PhoneNumber, c => {
        c.Type = Input.Types.Tel;
        c.FieldsPerRow(2);
    }),
];
```

### Custom User Creation

You can also override the new `ApplicationUser` Model that's created, and any Validation:

```csharp
feature.CreateUser = () => new ApplicationUser { EmailConfirmed = true };

feature.CreateUserValidation = async (req, createUser) =>
{
    await IdentityAdminUsers.ValidateCreateUserAsync(req, createUser);
    var displayName = createUser.GetUserProperty(nameof(ApplicationUser.DisplayName));
    if (string.IsNullOrEmpty(displayName))
        throw new ArgumentNullException(nameof(AdminUserBase.DisplayName));
    return null;
};
```

<div> <img class="shadow" src="/img/pages/auth/identity/admin-ui-users-create-custom.png"> </div>

<div> <img class="py-8 px-12" src="/img/pages/auth/identity/admin-ui-users-edit-custom.png"> </div>

### Admin User Events

Should you need to, Admin User Events can be used to execute custom logic before and after creating, updating and deleting users, e.g:

```csharp
feature.OnBeforeCreateUser = (request, user) => { ... };
feature.OnAfterCreateUser  = (request, user) => { ... };
feature.OnBeforeUpdateUser = (request, user) => { ... };
feature.OnAfterUpdateUser  = (request, user) => { ... };
feature.OnBeforeDeleteUser = (request, userId) => { ... };
feature.OnAfterDeleteUser  = (request, userId) => { ...
};
```

### IdentityAuth User Claims

The User Claim Management UI can be used to assign Claims to individual Users:

![](/img/pages/admin-ui/identityauth-user-claims.webp)

## Validating Claims

::include admin-ui-claims-validation.md::

---
# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/admin-ui-profiling.md
---
title: Logging & Profiling UI
---

The Request Logging & Profiling UIs bring an invaluable new level of observability into your App, from being able to quickly inspect and browse incoming requests, to tracing their behavior from the events generated by the [Diagnostic Source](https://docs.microsoft.com/en-us/dotnet/api/system.diagnostics.diagnosticsource?view=net-6.0) capabilities added all throughout ServiceStack, which both power the new UIs and enable new introspectability from code, where you can now tap in to inspect & debug when each diagnostic event occurs.

<lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="LgQHTSHSk1g" style="background-image: url('https://img.youtube.com/vi/LgQHTSHSk1g/maxresdefault.jpg')"></lite-youtube>

The quickest way to enable access to these new features in your App is with:

::: sh
npx add-in profiling
:::

Which will add the [Modular Startup](/modular-startup) configuration to your Host project that registers both Request Logging & Profiling features when running your App in [DebugMode](/debugging#debugmode) (i.e.
Development):

```csharp
public class ConfigureProfiling : IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureServices((context, services) => {
            if (context.HostingEnvironment.IsDevelopment())
            {
                services.AddPlugin(new ProfilingFeature {
                    IncludeStackTrace = true,
                });
            }
        });
}
```

Whilst Request Logs can be added with:

::: sh
npx add-in requestlogs
:::

Or if you prefer to store Request Logs in an SQLite database:

::: sh
npx add-in sqlitelogs
:::

Or to store Request Logs in PostgreSQL, SQL Server or MySql:

::: sh
npx add-in db-requestlogs
:::

The default configuration focuses on providing useful information during development, where request & response bodies are captured in the Request Logger and StackTraces are captured on the important events where they can be useful.

### Configuration

Depending on your App you'll want to change these defaults, e.g. if you're persisting the request logs using the [CSV Request Logger](/request-logger#csv-request-logger) or [Redis Request Logger](/request-logger#redis-request-logger) it may not be feasible to **capture all API responses** if they're very large.

If enabled, **StackTraces** are captured on these important events:

- **ServiceStack:** Before a Service Gateway Request is sent
- **OrmLite:** Opening a connection or rolling back a transaction
- **Redis:** Opening a connection, auto retrying a failed command, renting & returning clients from a redis manager pool

The latter of which is useful when resolving [concurrent usage issues](/redis/troubleshooting).

As it adds overhead, profiling should only be added when used, e.g. during development or when needing to debug production issues, although you may always want to capture request logs depending on how important it is to maintain an audit of completed API requests.

Should it be needed, both Logging & Profiling plugins offer several configuration options to control the type, info & events captured.
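As a sketch of this kind of tuning, the Request Logs plugin can be registered with explicit options using the `RequestLogsFeature` configuration properties documented later on this page, where the `HealthCheck` and `UpdatePassword` DTOs are hypothetical examples:

```csharp
// Capture request & response bodies during development, but exclude
// chatty requests and hide the bodies of sensitive ones
services.AddPlugin(new RequestLogsFeature {
    EnableResponseTracking = true,
    EnableRequestBodyTracking = true,
    ExcludeRequestDtoTypes = new[] { typeof(HealthCheck) },
    HideRequestBodyForRequestDtoTypes = new[] { typeof(UpdatePassword) },
});
```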
Whichever features you have registered will dynamically appear in the Admin UI's sidebar for quick navigation:

![](/img/pages/admin-ui/admin-ui-nav.png)

### Request Logging UI

Clicking on **Logging** navigates to the Request Logging UI, which displays each processed HTTP API request in a grid of useful high-level summary information:

![](/img/pages/admin-ui/logging-splash.png)

This screenshot shows an example of a non-authenticated user navigating to a protected page before signing in, then querying and submitting a new Booking in the [AutoQuery CRUD Bookings Demo](/autoquery/crud-bookings) using [Locode's](/locode/) Auto UI, in which we can see error API Responses highlighted in **red** and redirects highlighted in **yellow**.

The top navigation controls which results are displayed with:

- **Has Errors** - Show only requests with errors
- **Has Response** - Show only requests with response bodies
- **Reset Filters Icon** - Clear all filters (ESC)
- **Left/Right Icons** - Navigate to previous/next pages (LEFT/RIGHT)
- **Refresh Icon** - Reload the latest results

This same user workflow is also captured in the Profiling UI at much finer granularity, capturing all the events performed by APIs:

![](/img/pages/admin-ui/profiling-splash.png)

Clicking on an entry previews it in more detail, e.g.
clicking on the first **/api/QueryBookings** will show the API Request and Response made: <div class="block flex justify-center items-center"> <img class="max-w-screen-md" src="/img/pages/admin-ui/logging-QueryBookings.png"> </div> By default it shows the Request and Response DTOs in JSON, but clicking on preview often shows a more human-friendly view: <div class="block flex justify-center items-center"> <img class="max-w-screen-md" src="/img/pages/admin-ui/logging-QueryBookings-preview.png"> </div> A useful feature from having a dedicated UX-friendly UI is enabling quick navigation where each **blue** link will display results filtered to all matching requests, whilst the **trace request** link will navigate to the Profiling UI showing all diagnostic events performed within that request. ### Inspect Cookies and JWT Tokens In addition to Request & Response DTOs, the Logging UI also shows all captured HTTP information including HTTP Request Headers with any Cookies being extracted into its own view for better readability as well as decoded JWT payload from the **ss-tok** cookie when using [JWT Auth](/auth/jwt-authprovider) with non-encrypted JWT defaults. 
<div class="block flex justify-center items-center"> <img class="max-w-screen-md" src="/img/pages/admin-ui/logging-http-details.png"> </div>

Let's now take a look at the failed **CreateBooking** request to see what went wrong:

<div class="block flex justify-center items-center"> <img class="max-w-screen-md" src="/img/pages/admin-ui/logging-CreateBooking-validation.png"> </div>

Ok, so the Admin User (identified from JWT info) tried to create an empty booking, which was rejected by its server declarative validation rules, which surfaces these contextual validation errors in Locode's UI:

<div class="block flex justify-center items-center"> <img class="max-w-screen-md" src="/img/pages/locode/CreateBooking-invalid.png"> </div>

We can then see this was quickly rectified in the next request with a successful Booking submitted:

<div class="block flex justify-center items-center"> <img class="max-w-screen-md" src="/img/pages/admin-ui/logging-CreateBooking.png"> </div>

Clicking on **trace request** we can see all the diagnostic events performed in this API request within the **RequestBefore** and **RequestAfter** events, which took **0.07s** to complete.

## Profiling UI

Following diagnostic conventions you'll typically see 2 entries for each event, created before and after each action, measuring the duration and capturing the last event before any error occurred.

![](/img/pages/admin-ui/profiling-CreateBooking-trace.png)

### SQL Profiling

Clicking on an entry will show more useful contextual information captured for each event, e.g.
if you click on OrmLite's **CommandAfter** event you'll be able to see the generated SQL + Params executed by OrmLite:

<div class="block flex justify-center items-center"> <img class="max-w-screen-md" src="/img/pages/admin-ui/profiling-CreateBooking-CommandAfter.png"> </div>

The profiling detail view also contains **blue** links to filter matching diagnostic events, and shows useful information like the **Thread** and **User** this command was executed by, as well as the **duration** and **timestamp** of when it occurred.

### Redis Profiling

Redis' simpler commands are captured as a list of arguments:

<div class="block flex justify-center items-center"> <img class="max-w-screen-md" src="/img/pages/admin-ui/profiling-redis-CommandAfter.png"> </div>

### Purchase API Events Example

Surfacing the high-level events of your service implementations provides a new observability perspective that's harder to infer from trying to follow the details in the code. For example our [Order Page](https://account.servicestack.net/buy/BUS) generated over **150+ events** capturing all the SQL commands to store order, subscription, customer & payment information, the generated License and Order confirmation emails, HttpClient integration requests with Stripe, and MQ requests for sending emails in a [Background MQ Worker thread](/background-mq).
![](/img/pages/admin-ui/profiling-servicestack-buy1.png)

### HttpClient Profiling

HttpClient profiling is implemented a little differently than other events in that it builds on the existing HttpClient diagnostic events so it's able to capture general usage of .NET's HttpClient, which is how it's able to capture our [integration with Stripe](/stripe):

<div class="block flex justify-center items-center"> <img class="max-w-screen-md" src="/img/pages/admin-ui/profiling-servicestack-client-stripe.png"> </div>

### JsonApiClient Profiling

In contrast we're able to provide richer profiling for our .NET 10+ [JsonApiClient](/csharp-client#jsonapiclient), which has access to typed Request DTOs for submitting API Requests:

<div class="block flex justify-center items-center"> <img class="max-w-screen-md" src="/img/pages/admin-ui/profiling-client-api-request.png"> </div>

As well as Response DTOs returned in API Responses:

<div class="block flex justify-center items-center"> <img class="max-w-screen-md" src="/img/pages/admin-ui/profiling-client-api-response.png"> </div>

We can also examine API Error responses in richer detail, e.g:

<div class="block flex justify-center items-center"> <img class="max-w-screen-md" src="/img/pages/admin-ui/profiling-client-api-error.png"> </div>

### MQ Profiling

Since they execute APIs on an entirely different endpoint and worker threads, MQ Requests are tracked independently from HTTP APIs, starting their own diagnostic Activity which enables tracing all events generated from an MQ Request.
Here's an example used to handle sending customer emails: <div class="block flex justify-center items-center"> <img class="max-w-screen-md" src="/img/pages/admin-ui/profiling-MqRequestBefore.png"> </div> ### Service Gateway Profiling The [Service Gateway](/service-gateway) leverages ServiceStack's message-based design to enable loosely-coupled service integrations enabling systems to split into Microservices without needing to change any of the internal services consuming them. As they're not RPC invocations their messages are introspectable and can be observed in the Profiling UI: <div class="block flex justify-center items-center"> <img class="max-w-screen-md" src="/img/pages/admin-ui/profiling-gateway.png"> </div> ### Profile Custom Info We've made it easy to add a custom tracking field with the same functionality as the primary fields where they can be sorted and filtered. This could be used to attach a **Tenant Id** to the profiling information by providing a Label and Resolver function to resolve it, e.g: ```csharp new ProfilingFeature { TagLabel = "Tenant", TagResolver = req => MyResolveTenant(req), } ``` Where it will be displayed in all profiling results, e.g: ![](/img/pages/admin-ui/profiling-tenant-summary.png) ::: tip The number and order of fields can be customized in `SummaryFields` collection in `ProfilingFeature` ::: This custom info also appears in the detail page as a link which can be used to filter events with the same tenant id: <div class="block flex justify-center items-center"> <img class="max-w-screen-md" src="/img/pages/admin-ui/profiling-tenant-detail.png"> </div> ### Profile Custom Metadata You're also able to capture custom information for different events and have them appear in the detail page, e.g: ```csharp new ProfilingFeature { DiagnosticEntryFilter = (entry, evt) => { if (evt is RequestDiagnosticEvent requestEvent) { var req = requestEvent.Request; entry.Meta = new() { ["RemoteIp"] = req.RemoteIp, ["Referrer"] = 
req.UrlReferrer?.ToString(),
                ["Language"] = req.GetHeader(HttpHeaders.AcceptLanguage),
            };
        }
    },
}
```

Where it will be populated in the **Meta** section arguments:

<div class="block flex justify-center items-center"> <img class="max-w-screen-md" src="/img/pages/admin-ui/profiling-custom-meta.png"> </div>

### Access Diagnostic Events in Code

In addition to powering the profiling UI, the diagnostic events added throughout ServiceStack can be observed in code to tap in and inspect when these diagnostic events occur. It follows the standard Diagnostic Source model where you specify which listeners you want observed in `OnNext(DiagnosticListener)`, which you can then access in `OnNext(KeyValuePair<string,object>)`.

Microsoft's Diagnostic Events like HttpClient's use anonymous classes, making them unnecessarily difficult to access, which can be made easier by using our [Reflection Utils ToObjectDictionary()](/reflection-utils#converting-instances-from-an-object-dictionary). As they offer better utility, we've opted to use idiomatic strong types and string constants instead, where they're more accessible from C#.
You can use this skeleton class for a quick way to get started showing how to subscribe to all ServiceStack Diagnostic Sources and the event names and types to handle all profiling events: ```csharp // Register your Diagnostic Observer var observer = new MyDiagnosticObserver(); var subscription = DiagnosticListener.AllListeners.Subscribe(observer); public sealed class MyDiagnosticObserver : IObserver<DiagnosticListener>, IObserver<KeyValuePair<string, object>> { private readonly List<IDisposable> subscriptions = new(); /* Specify which Profiling Events you want to observe */ void IObserver<DiagnosticListener>.OnNext(DiagnosticListener diagnosticListener) { if (diagnosticListener.Name is Diagnostics.Listeners.ServiceStack || diagnosticListener.Name is Diagnostics.Listeners.OrmLite || diagnosticListener.Name is Diagnostics.Listeners.Redis || diagnosticListener.Name is Diagnostics.Listeners.Client || diagnosticListener.Name is Diagnostics.Listeners.HttpClient) { var subscription = diagnosticListener.Subscribe(this); subscriptions.Add(subscription); } } /* Handle Profiling Events */ public void OnNext(KeyValuePair<string, object> kvp) { /** ServiceStack */ /*** Request */ if (kvp.Key == Diagnostics.Events.ServiceStack.WriteRequestBefore && kvp.Value is RequestDiagnosticEvent reqBefore) { /*...*/ } if (kvp.Key == Diagnostics.Events.ServiceStack.WriteRequestAfter && kvp.Value is RequestDiagnosticEvent reqAfter) { /*...*/ } if (kvp.Key == Diagnostics.Events.ServiceStack.WriteRequestError && kvp.Value is RequestDiagnosticEvent reqError) { /*...*/ } /*** Gateway */ if (kvp.Key == Diagnostics.Events.ServiceStack.WriteGatewayBefore && kvp.Value is RequestDiagnosticEvent gatewayBefore) { /*...*/ } if (kvp.Key == Diagnostics.Events.ServiceStack.WriteGatewayAfter && kvp.Value is RequestDiagnosticEvent gatewayAfter) { /*...*/ } if (kvp.Key == Diagnostics.Events.ServiceStack.WriteGatewayError && kvp.Value is RequestDiagnosticEvent gatewayError) { /*...*/ } /*** MQ */ if (kvp.Key 
== Diagnostics.Events.ServiceStack.WriteMqRequestBefore && kvp.Value is MqRequestDiagnosticEvent mqReqBefore) { /*...*/ } if (kvp.Key == Diagnostics.Events.ServiceStack.WriteMqRequestAfter && kvp.Value is MqRequestDiagnosticEvent mqReqAfter) { /*...*/ } if (kvp.Key == Diagnostics.Events.ServiceStack.WriteMqRequestError && kvp.Value is MqRequestDiagnosticEvent mqReqError) { /*...*/ } if (kvp.Key == Diagnostics.Events.ServiceStack.WriteMqRequestPublish && kvp.Value is MqRequestDiagnosticEvent mqReqPublish) { /*...*/ } /** Client */ if (kvp.Key == Diagnostics.Events.Client.WriteRequestBefore && kvp.Value is HttpClientDiagnosticEvent clientBefore) { /*...*/ } if (kvp.Key == Diagnostics.Events.Client.WriteRequestAfter && kvp.Value is HttpClientDiagnosticEvent clientAfter) { /*...*/ } if (kvp.Key == Diagnostics.Events.Client.WriteRequestError && kvp.Value is HttpClientDiagnosticEvent clientError) { /*...*/ } /** HttpClient */ if (kvp.Key == Diagnostics.Events.HttpClient.OutStart) { var obj = kvp.Value.ToObjectDictionary(); } if (kvp.Key == Diagnostics.Events.HttpClient.Request) { var obj = kvp.Value.ToObjectDictionary(); } if (kvp.Key == Diagnostics.Events.HttpClient.OutStop) { var obj = kvp.Value.ToObjectDictionary(); } if (kvp.Key == Diagnostics.Events.HttpClient.Response) { var obj = kvp.Value.ToObjectDictionary(); } /** OrmLite */ if (kvp.Key == Diagnostics.Events.OrmLite.WriteCommandBefore && kvp.Value is OrmLiteDiagnosticEvent dbBefore) { /*...*/ } if (kvp.Key == Diagnostics.Events.OrmLite.WriteCommandAfter && kvp.Value is OrmLiteDiagnosticEvent dbAfter) { /*...*/ } if (kvp.Key == Diagnostics.Events.OrmLite.WriteCommandError && kvp.Value is OrmLiteDiagnosticEvent dbError) { /*...*/ } /*** OrmLite Connections */ if (kvp.Key == Diagnostics.Events.OrmLite.WriteConnectionOpenBefore && kvp.Value is OrmLiteDiagnosticEvent dbOpenBefore) { /*...*/ } if (kvp.Key == Diagnostics.Events.OrmLite.WriteConnectionOpenAfter && kvp.Value is OrmLiteDiagnosticEvent dbOpenAfter) { 
/*...*/ } if (kvp.Key == Diagnostics.Events.OrmLite.WriteConnectionOpenError && kvp.Value is OrmLiteDiagnosticEvent dbOpenError) { /*...*/ } if (kvp.Key == Diagnostics.Events.OrmLite.WriteConnectionCloseBefore && kvp.Value is OrmLiteDiagnosticEvent dbCloseBefore) { /*...*/ } if (kvp.Key == Diagnostics.Events.OrmLite.WriteConnectionCloseAfter && kvp.Value is OrmLiteDiagnosticEvent dbCloseAfter) { /*...*/ } if (kvp.Key == Diagnostics.Events.OrmLite.WriteConnectionCloseError && kvp.Value is OrmLiteDiagnosticEvent dbCloseError) { /*...*/ } /*** OrmLite Transactions */ if (kvp.Key == Diagnostics.Events.OrmLite.WriteTransactionOpen && kvp.Value is OrmLiteDiagnosticEvent commitOpen) { /*...*/ } if (kvp.Key == Diagnostics.Events.OrmLite.WriteTransactionCommitBefore && kvp.Value is OrmLiteDiagnosticEvent commitBefore) { /*...*/ } if (kvp.Key == Diagnostics.Events.OrmLite.WriteTransactionCommitAfter && kvp.Value is OrmLiteDiagnosticEvent commitAfter) { /*...*/ } if (kvp.Key == Diagnostics.Events.OrmLite.WriteTransactionCommitError && kvp.Value is OrmLiteDiagnosticEvent commitError) { /*...*/ } if (kvp.Key == Diagnostics.Events.OrmLite.WriteTransactionRollbackBefore && kvp.Value is OrmLiteDiagnosticEvent rollbackBefore) { /*...*/ } if (kvp.Key == Diagnostics.Events.OrmLite.WriteTransactionRollbackAfter && kvp.Value is OrmLiteDiagnosticEvent rollbackAfter) { /*...*/ } if (kvp.Key == Diagnostics.Events.OrmLite.WriteTransactionRollbackError && kvp.Value is OrmLiteDiagnosticEvent rollbackError) { /*...*/ } /** Redis */ if (kvp.Key == Diagnostics.Events.Redis.WriteCommandBefore && kvp.Value is RedisDiagnosticEvent redisBefore) { /*...*/ } if (kvp.Key == Diagnostics.Events.Redis.WriteCommandRetry && kvp.Value is RedisDiagnosticEvent redisRetry) { /*...*/ } if (kvp.Key == Diagnostics.Events.Redis.WriteCommandAfter && kvp.Value is RedisDiagnosticEvent redisAfter) { /*...*/ } if (kvp.Key == Diagnostics.Events.Redis.WriteCommandError && kvp.Value is RedisDiagnosticEvent redisError) { 
/*...*/ } /*** Redis Connections */ if (kvp.Key == Diagnostics.Events.Redis.WriteConnectionOpenBefore && kvp.Value is RedisDiagnosticEvent redisOpenBefore) { /*...*/ } if (kvp.Key == Diagnostics.Events.Redis.WriteConnectionOpenAfter && kvp.Value is RedisDiagnosticEvent redisOpenAfter) { /*...*/ } if (kvp.Key == Diagnostics.Events.Redis.WriteConnectionOpenError && kvp.Value is RedisDiagnosticEvent redisOpenError) { /*...*/ } if (kvp.Key == Diagnostics.Events.Redis.WriteConnectionCloseBefore && kvp.Value is RedisDiagnosticEvent redisCloseBefore) { /*...*/ } if (kvp.Key == Diagnostics.Events.Redis.WriteConnectionCloseAfter && kvp.Value is RedisDiagnosticEvent redisCloseAfter) { /*...*/ } if (kvp.Key == Diagnostics.Events.Redis.WriteConnectionCloseError && kvp.Value is RedisDiagnosticEvent redisCloseError) { /*...*/ } /*** Redis Pools */ if (kvp.Key == Diagnostics.Events.Redis.WritePoolRent && kvp.Value is RedisDiagnosticEvent redisPoolBefore) { /*...*/ } if (kvp.Key == Diagnostics.Events.Redis.WritePoolReturn && kvp.Value is RedisDiagnosticEvent redisPoolAfter) { /*...*/ } } void IObserver<DiagnosticListener>.OnCompleted() { subscriptions.ForEach(x => x.Dispose()); subscriptions.Clear(); } public void OnCompleted() {} void IObserver<DiagnosticListener>.OnError(Exception error) {} public void OnError(Exception error) {} } ``` ### Request Logs Configuration The [Request Logs](/request-logger) feature has a number of configuration options controlling which requests are logged and the level of logging captured about them. ```csharp class RequestLogsFeature { // Limit API access to users in role string AccessRole = RoleNames.Admin; // RequestLogs service Route, default is /requestlogs string AtRestPath = "/requestlogs"; // Size of memory logger circular buffer int? 
Capacity;

    // Turn On/Off Session Tracking
    bool EnableSessionTracking;

    // Turn On/Off Logging of Raw Request Body, default is Off
    bool EnableRequestBodyTracking;
    // Turn On/Off Raw Request Body Tracking per-request
    Func<IRequest, bool> RequestBodyTrackingFilter;

    // Turn On/Off Tracking of Responses
    bool EnableResponseTracking = false;
    // Turn On/Off Tracking of Responses per-request
    Func<IRequest, bool> ResponseTrackingFilter;

    // Turn On/Off Tracking of Exceptions
    bool EnableErrorTracking = true;

    // Don't log matching requests
    Func<IRequest, bool> SkipLogging;

    // Change the RequestLogger provider. Default is InMemoryRollingRequestLogger
    IRequestLogger RequestLogger;

    // Don't log requests of these types. By default RequestLogs are excluded
    Type[] ExcludeRequestDtoTypes;

    // Don't log request bodies for services with sensitive information.
    // By default Auth and Registration requests are hidden.
    Type[] HideRequestBodyForRequestDtoTypes;

    // Don't log Response DTO Types
    Type[] ExcludeResponseTypes;

    // Limit logging to only Service Requests
    bool LimitToServiceRequests = true;

    // Customize Request Log Entry
    Action<IRequest, RequestLogEntry> RequestLogFilter;

    // Ignore logging and serializing these Request DTOs
    List<Type> IgnoreTypes = new();

    // Use custom Ignore Request DTO predicate
    Func<object,bool> IgnoreFilter = DefaultIgnoreFilter;

    // Default take, if none is specified
    int DefaultLimit = 100;

    // Change what DateTime to use for the current Date (defaults to UtcNow)
    Func<DateTime> CurrentDateFn = () => DateTime.UtcNow;
}
```

### Profiling Configuration

The `ProfilingFeature` offers similar functionality for specifying which sources to observe and profiling events to capture, as well as options for customizing the Profiling UI, e.g. you can limit generating & capturing diagnostic events to just [OrmLite](/ormlite/) and [Redis](/redis/) with:

```csharp
Plugins.Add(new ProfilingFeature {
    Profile = ProfileSource.OrmLite | ProfileSource.Redis
});
```

For further configuration
options see the documented plugin below:

```csharp
[Flags]
enum ProfileSource
{
    None         = 0,
    ServiceStack = 1 << 0,
    Client       = 1 << 1,
    Redis        = 1 << 2,
    OrmLite      = 1 << 3,
    All = ServiceStack | Client | OrmLite | Redis,
}

class ProfilingFeature
{
    // Limit API access to users in role
    string AccessRole = RoleNames.Admin;

    // Which features to Profile, default all
    ProfileSource Profile = ProfileSource.All;

    // Size of circular buffer of profiled events
    int Capacity = 10000;

    // Don't log requests of these types. By default Profiling/Metadata requests are excluded
    List<Type> ExcludeRequestDtoTypes = new();

    // Don't log requests from these path info prefixes
    List<string> ExcludeRequestPathInfoStartingWith = new();

    // Don't profile requests matching filter
    Func<IRequest, bool>? ExcludeRequestsFilter;

    // Don't log request bodies for services with sensitive information.
    // By default Auth and Registration requests are hidden.
    List<Type> HideRequestBodyForRequestDtoTypes = new();

    // Don't log Response DTO Types
    List<Type> ExcludeResponseTypes = new();

    // Turn On/Off Tracking of Responses per-request
    Func<IRequest, bool>? ResponseTrackingFilter;

    // Whether to include CallStack StackTrace
    bool? IncludeStackTrace;

    // Attach custom data to request profiling summary fields
    Func<IRequest,string?>? TagResolver;

    // Label to show for custom tag
    string? TagLabel;

    // The properties displayed in Profiling UI results grid
    List<string> SummaryFields;

    // Default take, if none is specified
    int DefaultLimit = 50;

    // Customize DiagnosticEntry that gets captured
    Action<DiagnosticEntry, DiagnosticEvent>?
        DiagnosticEntryFilter;

    // Maximum char/byte length of string response body
    int MaxBodyLength = 10 * 10 * 1024;
}
```

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/admin-ui-rdbms-analytics.md

---
title: Admin UI Analytics for RDBMS
---

[ServiceStack v8.9](/releases/v8_09) restores parity for **PostgreSQL**, **SQL Server** & **MySQL** RDBMSs with our previous SQLite-only features via the new `DbRequestLogger`, a drop-in replacement for [SQLite Request Logging](/sqlite-request-logs) that persists API Request Logs to an RDBMS.

Whilst maintaining an archive of API Requests is nice, the real value of DB Request Logging is that it unlocks the comprehensive API Analytics and querying functionality that was previously limited to SQLite Request Logs.

:::youtube kjLcm1llC5Y
In Depth and Interactive API Analytics available to all ASP .NET Core ServiceStack Apps!
:::

### Benefits of API Analytics

They provide deep and invaluable insight into your System's API Usage, device distribution, Users, API Keys and the IPs where most traffic originates:

- **Visibility:** Provides a clear, visual summary of complex log data, making it easier to understand API usage and performance at a glance.
- **Performance Monitoring:** Helps track key metrics like request volume and response times to ensure APIs are meeting performance expectations.
- **User Understanding:** Offers insights into how users (and bots) are interacting with the APIs (devices, browsers).
- **Troubleshooting:** Aids in quickly identifying trends, anomalies, or specific endpoints related to issues.
- **Resource Planning:** Understanding usage patterns helps in scaling infrastructure appropriately.
- **Security Insight:** Identifying bot traffic and unusual request patterns can be an early indicator of security concerns.
### Interactive Analytics

Analytics are also interactive, letting you drill down to monitor the activity of individual APIs, Users, API Keys and IPs, each with further links back to the request logs the summary analytics are derived from.

As they offer significant and valuable insights, the `SqliteRequestLogger` is built into all ASP.NET Core IdentityAuth templates. To switch it over to use an RDBMS we recommend installing the `db-identity` mix gist to also replace SQLite BackgroundJobs with the RDBMS `DatabaseJobFeature`:

:::sh
npx add-in db-identity
:::

Or if you just want to replace SQLite Request Logs with an RDBMS use:

:::sh
npx add-in db-requestlogs
:::

Or you can copy the [Modular Startup](/modular-startup) script below:

```csharp
[assembly: HostingStartup(typeof(MyApp.ConfigureRequestLogs))]

namespace MyApp;

public class ConfigureRequestLogs : IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureServices((context, services) => {
            services.AddPlugin(new RequestLogsFeature {
                RequestLogger = new DbRequestLogger {
                    // NamedConnection = "<alternative db>"
                },
                EnableResponseTracking = true,
                EnableRequestBodyTracking = true,
                EnableErrorTracking = true
            });
            services.AddHostedService<RequestLogsHostedService>();
            if (context.HostingEnvironment.IsDevelopment())
            {
                services.AddPlugin(new ProfilingFeature());
            }
        });
}

public class RequestLogsHostedService(ILogger<RequestLogsHostedService> log, IRequestLogger requestLogger)
    : BackgroundService
{
    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        using var timer = new PeriodicTimer(TimeSpan.FromSeconds(3));
        if (requestLogger is IRequireAnalytics logger)
        {
            while (!stoppingToken.IsCancellationRequested && await timer.WaitForNextTickAsync(stoppingToken))
            {
                await logger.TickAsync(log, stoppingToken);
            }
        }
    }
}
```

### RDBMS Provider

When using a remote RDBMS, network latency becomes a primary concern that any solution needs to be designed around, as such the API
Request Logs are initially maintained in an in-memory collection before being flushed to the database **every 3 seconds**, configurable in the `PeriodicTimer` interval above.

To reduce the number of round-trips to the database, the `DbRequestLogger` batches all pending logs into a single request using [OrmLite's Bulk Inserts](/ormlite/bulk-inserts) which are supported by all major RDBMSs.

### PostgreSQL Table Partitioning

PostgreSQL provides native support for table partitioning, allowing us to automatically create monthly partitions using `PARTITION BY RANGE` on the `CreatedDate` column. The `DbRequestLogger` automatically creates new monthly partitions as needed, maintaining the same logical separation as SQLite's monthly .db's while keeping everything within a single Postgres DB:

```sql
CREATE TABLE "RequestLog" (
    -- columns...
    "CreatedDate" TIMESTAMP NOT NULL,
    PRIMARY KEY ("Id","CreatedDate")
) PARTITION BY RANGE ("CreatedDate");

-- Monthly partitions are automatically created, e.g.:
CREATE TABLE "RequestLog_2025_01" PARTITION OF "RequestLog"
    FOR VALUES FROM ('2025-01-01') TO ('2025-02-01');
```

### SQLServer / MySQL - Manual Partition Management

For **SQL Server** and **MySQL**, monthly partitioned tables need to be created **out-of-band** (either manually or via cronjob scripts) since they don't support the same level of automatic partition management as PostgreSQL.

However, this still works well in practice because `RequestLog` is an **Append Only** table and all querying from the Admin UIs is filtered by its indexed `CreatedDate` into monthly viewable snapshots, just as it was with SQLite.

### Separate RequestLog Database

Or if preferred, you can maintain request logs in a **separate database** from your main application database.
This separation keeps the write-heavy logging load off your primary database, allowing you to optimize each database independently for its specific workload patterns, like maintaining different backup strategies for your critical application data vs. log history.

```csharp
// Configure.Db.cs
services.AddOrmLite(options => options.UsePostgres(connectionString))
    .AddPostgres("logs", logsConnectionString);

// Configure.RequestLogs.cs
services.AddPlugin(new RequestLogsFeature {
    RequestLogger = new DbRequestLogger {
        NamedConnection = "logs"
    },
    //...
});
```

## Queryable Admin Logging UI

This will enable a more feature-rich Request Logging Admin UI which utilizes the full queryability of the [AutoQueryGrid](/vue/autoquerygrid) component to filter, sort and export Request Logs.

[![](/img/pages/admin-ui/sqlitelogs.webp)](/img/pages/admin-ui/sqlitelogs.webp)

## Analytics Overview

Utilizing a `DbRequestLogger` also enables the **Analytics** Admin UI in the sidebar which initially displays the API Analytics Dashboard:

:::{.wideshot}
[![](/img/pages/admin-ui/analytics-apis1.webp)](/img/pages/admin-ui/analytics-apis1.webp)
:::

### Distribution Pie Charts

Lets you quickly understand the composition of your user base and traffic sources: the distribution of users across different web browsers and device types, and the proportion of traffic coming from automated bots.

### Requests per day Line Chart

Lets you monitor API usage trends and performance over time. It tracks the total number of API requests and the average response time day-by-day. You can easily spot trends like peak usage hours/days, identify sudden spikes or drops in traffic, and correlate request volume with API performance, which is crucial for capacity planning and performance troubleshooting.

### API tag groups Pie Chart

Lets you understand the usage patterns across different functional categories of your APIs.
By grouping API requests based on assigned tags (like Security, Authentication, User Management, Tech, etc.), you get a high-level view of which *types* of functionalities are most frequently used or are generating the most load.

### API Requests Bar Chart

Ranks individual API endpoints by the number of requests they receive, letting you identify the most and least frequently used endpoints. This helps pinpoint:

- **Critical Endpoints:** The most heavily used APIs that require robust performance and monitoring.
- **Optimization Targets:** High-traffic endpoints that could benefit from performance optimization.
- **Underutilized Endpoints:** APIs that might be candidates for deprecation or require promotion.
- **Troubleshooting:** If performance issues arise (seen in the line chart), this helps narrow down which specific endpoint might be responsible.

:::{.wideshot}
[![](/img/pages/admin-ui/analytics-apis2.webp)](/img/pages/admin-ui/analytics-apis2.webp)
:::

### Total Duration Bar Chart

Identifies which API endpoints consume the most *cumulative processing time* over the selected period. Even if an API endpoint is relatively fast per call, if it's called extremely frequently it can contribute significantly to overall server load. Optimizing these can lead to significant savings in server resources (CPU, memory).

### Average Duration Bar Chart

Pinpoints which API endpoints are the slowest on a *per-request* basis. APIs at the top of this list are prime candidates for performance investigation and optimization, as they represent potential user-facing slowness or system bottlenecks.

### Requests by Duration Ranges Histogram

Provides an overview of the performance distribution for *all* API requests. This chart shows how many requests fall into different speed buckets and helps you understand the overall responsiveness of your API system at a glance.
## Individual API Analytics

Clicking on an API's bar chart displays a dedicated, detailed view of a single API endpoint's behavior, isolating its performance and usage patterns from the overall system metrics and offering immediate insight into the endpoint's traffic volume and reliability.

:::{.wideshot}
[![](/img/pages/admin-ui/analytics-api.webp)](/img/pages/admin-ui/analytics-api.webp)
:::

### Total Requests

Displays the total requests for an API during the selected month. It includes an HTTP Status Breakdown which provides **direct access to the filtered request logs**. This is a major benefit for **rapid troubleshooting**, allowing you to instantly view the specific log entries corresponding to successful requests or particular error codes for this API.

### Last Request Information

Provides immediate context on the most recent activity for this endpoint with *when* the last request occurred, the source **IP address** and device information, to help understand recent usage, check if the endpoint is still active, or quickly investigate the very last interaction if needed.

### Duration Summary Table (Total, Min, Max)

Quantifies the performance characteristics specifically for this endpoint with the cumulative (Total) processing load, the best-case performance (Min), and the worst-case performance (Max), which is useful for identifying performance outliers.

### Duration Requests Histogram

Visualizes the performance distribution for this API.

### Top Users Bar Chart

Identifies which authenticated users call this API most frequently and rely on this endpoint the most. This can be useful for identifying power users, potential API abuse by a specific user account, or understanding the impact of changes to this API on key users.

### Top IP Addresses Bar Chart

Shows which source IP addresses are generating the most traffic for this API. Useful for identifying high-volume clients, specific servers interacting with this endpoint, or potentially malicious IPs.
## Users

The **Users** tab displays the top 100 Users who make the most API Requests and lets you click on a User's bar chart to view their individual User analytics.

:::{.wideshot}
[![](/img/pages/admin-ui/analytics-users.webp)](/img/pages/admin-ui/analytics-users.webp)
:::

### Individual User Analytics

Provides a comprehensive view of a single user's complete interaction history and behavior across all APIs they've accessed, shifting the focus from API performance to user experience and activity.

:::{.wideshot}
[![](/img/pages/admin-ui/analytics-user.webp)](/img/pages/admin-ui/analytics-user.webp)
:::

### User Info & Total Requests

Identifies the user and quantifies their overall activity level. Clicking on their ID or Name will navigate to the Users Admin UI. It also shows their success/error rate via the clickable status code links. This helps gauge user engagement and baseline activity.

### Last Request Information

Offers a snapshot of the user's most recent interaction for immediate context. Knowing **when**, **what** API they called, from which **IP address**, using which **client** & **device** is valuable for support, identifying their last action or checking recent activity.

### HTTP Status Pie Chart

Visualizes the overall success and error rate specifically for this user's API requests.

### Performance & Request Body Summary Table

Quantifies the performance experienced by this user and the data they typically send.

### Duration Requests Histogram

Shows the distribution of response times for requests made by this user, to help understand the typical performance this user experiences.

### Top APIs Bar Chart

Reveals which API endpoints this user interacts with most frequently, helping you understand user behavior and which features they use most.

### Top IP Addresses Bar Chart

Identifies the primary network locations or devices the user connects from.
### User Admin UI Analytics

To assist discoverability, a snapshot of a User's Analytics is also visible in the Users Admin UI:

[![](/img/pages/admin-ui/analytics-user-adminui.webp)](/img/pages/admin-ui/analytics-user-adminui.webp)

Clicking on **View User Analytics** takes you to the User's Analytics page to access the full Analytics features and navigation.

## API Keys

The **API Keys** tab displays the top 100 API Keys that make the most API Requests and lets you click on an API Key's bar chart to view its individual API Key analytics.

:::{.wideshot}
[![](/img/pages/admin-ui/analytics-apikeys.webp)](/img/pages/admin-ui/analytics-apikeys.webp)
:::

### Individual API Key Analytics

Provides comprehensive API Key analytics, similar to User Analytics but limited to the API Usage of a single API Key:

:::{.wideshot}
[![](/img/pages/admin-ui/analytics-apikey.webp)](/img/pages/admin-ui/analytics-apikey.webp)
:::

## IPs

The **IP Addresses** tab displays the top 100 IPs that make the most API Requests. Click on an IP's bar chart to view the individual analytics for requests made from that IP Address.
:::{.wideshot}
[![](/img/pages/admin-ui/analytics-ips.webp)](/img/pages/admin-ui/analytics-ips.webp)
:::

### Individual IP Analytics

Provides comprehensive IP Address analytics, similar to User Analytics but limited to the API Usage from a single IP Address:

:::{.wideshot}
[![](/img/pages/admin-ui/analytics-ip.webp)](/img/pages/admin-ui/analytics-ip.webp)
:::

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/admin-ui-redis.md

---
title: Redis Admin
---

The Redis Admin UI lets you manage your App's configured Redis Server with a user-friendly UX for managing core Redis data types, simple search functionality to quickly find Redis values, quick navigation between related values, first-class support for JSON values, and a flexible command interface with command history to inspect all previously run redis commands, which can be easily edited & re-run.

<lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="AACZtTOcQbg" style="background-image: url('https://img.youtube.com/vi/AACZtTOcQbg/maxresdefault.jpg')"></lite-youtube>

To add Redis support to your project:

:::sh
npx add-in redis
:::

Or if you already have Redis configured, enable it by registering the `AdminRedisFeature` plugin:

```csharp
services.AddPlugin(new AdminRedisFeature());
```

### Redis Stats on Dashboard

The [Admin Dashboard](/admin-ui#dashboard) contains valuable insight into monitoring the health of your App's redis usage with both client & server counters:

[![](/img/pages/admin-ui/admin-ui-redis-stats.png)](/admin-ui#dashboard)

::: tip
A description of each of these stats is available in the [Redis Stats docs](/redis/stats)
:::

## Info

The Redis Admin home page shows the output of the Redis [INFO](https://redis.io/commands/info/) command containing detailed information on the remote redis server:

![](/img/pages/admin-ui/admin-ui-redis.png)

By default it uses the App's configured database but can easily switch between Redis databases
with the numbered Database dropdown.

### Modify Redis Connection

Changing your App's Redis Configuration at runtime can be enabled with:

```csharp
Plugins.Add(new AdminRedisFeature {
    ModifiableConnection = true
});
```

Which will linkify the Redis Connection string to open the **Change Connection** Dialog:

![](/img/pages/admin-ui/admin-ui-redis-connection.png)

Be aware this will change your App's Redis Connection at runtime to a different redis server than the one it was configured with, which can be useful if you have a warm stand-by Redis server you want to switch to without redeploying your App.

## Search

The Search tab is where you'll find the primary functionality for quickly searching through the Redis keyspace, and where you can create or edit Redis [Strings](https://redis.io/docs/data-types/strings/), [Lists](https://redis.io/docs/data-types/lists/), [Sets](https://redis.io/docs/data-types/sets/), [Sorted Sets](https://redis.io/docs/data-types/sorted-sets/) and [Hashes](https://redis.io/docs/data-types/hashes/):

![](/img/pages/admin-ui/admin-ui-redis-new.png)

### Set

Selecting a Redis Data Type displays an optimized form you can use to create a new Value of that type:

![](/img/pages/admin-ui/admin-ui-redis-new-set.png)

Which you can view in a **Pretty** view where collections like sets are displayed in a formatted JS Array letting you copy all its values:

![](/img/pages/admin-ui/admin-ui-redis-set-pretty.png)

A **Preview** mode displaying the results in a human-friendly table view:

![](/img/pages/admin-ui/admin-ui-redis-set-preview.png)

And an **Edit** mode where you can **add** and **delete** members:

![](/img/pages/admin-ui/admin-ui-redis-set-edit.png)

### String

The same functionality is available for all Data Types, whilst **Strings** contain first-class support for JSON strings in the **Pretty** tab:

![](/img/pages/admin-ui/admin-ui-redis-string-pretty.png)

**Preview**

![](/img/pages/admin-ui/admin-ui-redis-string-preview.png)

and
**Edit** views where you can **indent JSON** when creating or editing JSON Strings:

![](/img/pages/admin-ui/admin-ui-redis-string-edit.png)

### Hash

Hashes have the same functionality as **SET** with an additional field to capture each hash entry's value:

![](/img/pages/admin-ui/admin-ui-redis-hash-edit.png)

### Sorted Set

Whilst Sorted Sets maintain an extra numerical field to capture Sorted Set scores:

![](/img/pages/admin-ui/admin-ui-redis-zset-edit.png)

## Command

The Command tab gives you a flexible Command bar letting you run custom Redis commands against the selected database, including a Command History capturing all previously run commands that can be reselected to quickly edit & re-run commands:

![](/img/pages/admin-ui/admin-ui-redis-command.png)

By default Redis Admin blocks running dangerous commands unsuitable for a Web interface, which can be modified when registering the `AdminRedisFeature`. By default it prevents the commands below:

```csharp
Plugins.Add(new AdminRedisFeature {
    IllegalCommands = {
        "BLMOVE",
        "BLMPOP",
        "BLPOP",
        "BRPOP",
        "BRPOPLPUSH",
        "FLUSHDB",
        "FLUSHALL",
        "MONITOR",
    }
});
```

## Profile App Redis Usage

In addition to the command history's log of all commands executed in the Redis Admin UI, you can inspect the redis commands executed by your Services with the [Redis Profiling](/admin-ui-profiling#redis-profiling) built into the [Admin Profiling UI](/admin-ui-profiling).

[![](/img/pages/admin-ui/profiling-redis-CommandAfter.png)](/admin-ui-profiling#redis-profiling)

## Feedback Welcome

We hope you'll find the Redis Admin feature useful. Please let us know what other features you would like in [ServiceStack/Discuss](https://github.com/ServiceStack/Discuss/discussions).
---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/admin-ui-users.md

---
title: User Admin Feature
---

::: info
When using ASP.NET Core **Identity Auth** refer to [Identity Auth Admin Users UI](/admin-ui-identity-users) instead
:::

The User Admin Plugin is a lightweight API providing user management functionality around Auth Repository APIs, enabling remote programmatic access to manage your registered [User Auth Repository](/auth/authentication-and-authorization#user-auth-repository), featuring:

- Works with existing `IUserAuthRepository` sync or async providers
- Utilizes Progressive enhancement, e.g. search functionality utilizes `IQueryUserAuth` (if exists) performing a wildcard search over multiple fields, otherwise falls back to exact match on `UserName` or `Email`
- Supports managing Auth Repositories utilizing custom `UserAuth` data models
- Flexible UI options for customizing which fields to include in Search Results and Create/Edit UIs
- Rich Metadata aggregating only App-specific Roles & Permissions defined in your App
- User Events allow you to execute custom logic before & after each Created/Updated/Deleted User

### Installation

The `AdminUsersFeature` plugin contains no additional dependencies and at a minimum can be registered with:

```csharp
Plugins.Add(new AdminUsersFeature());
```

<div class="not-prose">
 <a href="https://razor-pages.web-templates.io/admin-ui">
   <h3 class="text-center font-medium text-3xl mb-3">/admin-ui/users</h3>
   <div class="block p-4 rounded shadow hover:shadow-lg">
     <img src="/img/pages/admin-ui/users.png">
   </div>
 </a>
</div>

::: info
An `IAuthRepository` is a required registered dependency to be able to use the `AdminUsersFeature` plugin.
:::

## Managing Users

By default, the Add and Edit Users forms contain the default layout of common properties in [UserAuth.cs](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack/Auth/UserAuth.cs)

<div class="flex justify-center py-8">
 <a href="https://razor-pages.web-templates.io/admin-ui/users?edit=2">
   <img src="/img/pages/admin-ui/users-edit-default.png" style="max-width:800px;">
 </a>
</div>

## Customization

To customize this user interface to accommodate custom properties, the `UserFormLayout` needs to be overridden. For example, below we have a custom `UserAuth` called `AppUser` with additional properties.

```csharp
// Custom User Table with extended Metadata properties
public class AppUser : UserAuth
{
    public Department Department { get; set; }
    public string? ProfileUrl { get; set; }
    public string? LastLoginIp { get; set; }
    public bool IsArchived { get; set; }
    public DateTime? ArchivedDate { get; set; }
    public DateTime? LastLoginDate { get; set; }
}

public enum Department
{
    None,
    Marketing,
    Accounts,
    Legal,
    HumanResources,
}
```

The `AdminUsersFeature` has multiple fields that can be used to customize the UI, including:

| Property Name             | Description                                                        |
|---------------------------|--------------------------------------------------------------------|
| `QueryUserAuthProperties` | Columns visible in query results for users.                        |
| `QueryMediaRules`         | Which columns *start* appearing at different screen sizes.         |
| `UserFormLayout`          | Control which fields are used for Create/Edit and their placement. |

### Custom User Form Layout

Similar to the [API Explorer](./api-explorer.md#formlayout) `FormLayout` customization, `UserFormLayout` is used to control placement and details about individual fields.
```csharp
appHost.Plugins.Add(new ServiceStack.Admin.AdminUsersFeature {
    // Show custom fields in Search Results
    QueryUserAuthProperties = new() {
        nameof(AppUser.Id),
        nameof(AppUser.Email),
        nameof(AppUser.DisplayName),
        nameof(AppUser.Department),
        nameof(AppUser.CreatedDate),
        nameof(AppUser.LastLoginDate),
    },

    QueryMediaRules = new() {
        MediaRules.ExtraSmall.Show<AppUser>(x => new { x.Id, x.Email, x.DisplayName }),
        MediaRules.Small.Show<AppUser>(x => x.Department),
    },

    // Add Custom Fields to Create/Edit User Forms
    FormLayout = new() {
        Input.For<AppUser>(x => x.Email, x => x.Type = Input.Types.Email),
        Input.For<AppUser>(x => x.DisplayName),
        Input.For<AppUser>(x => x.UserName),
        Input.For<AppUser>(x => x.Company, c => c.FieldsPerRow(2)),
        Input.For<AppUser>(x => x.Department, c => c.FieldsPerRow(2)),
        Input.For<AppUser>(x => x.PhoneNumber, c => c.Type = Input.Types.Tel),
        Input.For<AppUser>(x => x.Nickname, c => {
            c.Help = "Public alias (3-12 lower alpha numeric chars)";
            c.Pattern = "^[a-z][a-z0-9_.-]{3,12}$";
        }),
        Input.For<AppUser>(x => x.ProfileUrl, c => c.Type = Input.Types.Url),
        Input.For<AppUser>(x => x.IsArchived, c => c.FieldsPerRow(2)),
        Input.For<AppUser>(x => x.ArchivedDate, c => c.FieldsPerRow(2)),
    }
});
```

This enables the use of custom properties as well as formatting for ease of use. `UserFormLayout` updates the `Create` and `Edit` screens in the Admin UI.

<div class="flex justify-center py-8">
 <a href="https://razor-pages.web-templates.io/admin-ui/users?edit=2">
   <img src="/img/pages/admin-ui/users-edit-custom.png" style="max-width:800px;">
 </a>
</div>

## Admin User Services

The Admin User back-end APIs themselves can also be used to manage users within your own Apps. All the Admin Users DTOs below contain everything needed to call their APIs from [.NET Service Clients](/csharp-client) and are all contained within **ServiceStack.Client** so no additional dependencies are needed.
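As an illustrative sketch of calling these APIs from a typed client, assuming a client authenticated with an **Admin** user (the base URL, user names and passwords below are placeholders):

```csharp
// Hypothetical base URL; the client must first authenticate as an Admin user
var client = new JsonServiceClient("https://example.org");
client.Post(new Authenticate {
    provider = "credentials",
    UserName = "admin@example.org",
    Password = "p@55wOrd",
});

// Create a new user with an initial role
var response = client.Post(new AdminCreateUser {
    UserName = "test.user",
    Email = "test.user@example.org",
    Password = "p@55wOrd",
    Roles = new() { "Employee" },
});
var userId = response.Id;

// Later: lock the user and remove their role
client.Put(new AdminUpdateUser {
    Id = userId,
    LockUser = true,
    RemoveRoles = new() { "Employee" },
});
```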
The APIs are fairly straight-forward with each DTO containing the bare minimum of Typed properties, with all other UserAuth fields you want updated placed in the `UserAuthProperties` Dictionary, whilst all User result-sets are returned in an unstructured Object Dictionary.

```csharp
public abstract class AdminUserBase : IMeta
{
    public string UserName { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public string DisplayName { get; set; }
    public string Email { get; set; }
    public string Password { get; set; }
    public string ProfileUrl { get; set; }
    public Dictionary<string, string> UserAuthProperties { get; set; }
    public Dictionary<string, string> Meta { get; set; }
}

public partial class AdminCreateUser : AdminUserBase, IPost, IReturn<AdminUserResponse>
{
    public List<string> Roles { get; set; }
    public List<string> Permissions { get; set; }
}

public partial class AdminUpdateUser : AdminUserBase, IPut, IReturn<AdminUserResponse>
{
    public string Id { get; set; }
    public bool? LockUser { get; set; }
    public bool? UnlockUser { get; set; }
    public List<string> AddRoles { get; set; }
    public List<string> RemoveRoles { get; set; }
    public List<string> AddPermissions { get; set; }
    public List<string> RemovePermissions { get; set; }
}

public partial class AdminGetUser : IGet, IReturn<AdminUserResponse>
{
    public string Id { get; set; }
}

public partial class AdminDeleteUser : IDelete, IReturn<AdminDeleteUserResponse>
{
    public string Id { get; set; }
}

public class AdminDeleteUserResponse : IHasResponseStatus
{
    public string Id { get; set; }
    public ResponseStatus ResponseStatus { get; set; }
}

public partial class AdminUserResponse : IHasResponseStatus
{
    public string Id { get; set; }
    public Dictionary<string,object> Result { get; set; }
    public ResponseStatus ResponseStatus { get; set; }
}

public partial class AdminQueryUsers : IGet, IReturn<AdminUsersResponse>
{
    public string Query { get; set; }
    public string OrderBy { get; set; }
    public int?
        Skip { get; set; }
    public int? Take { get; set; }
}

public class AdminUsersResponse : IHasResponseStatus
{
    public List<Dictionary<string,object>> Results { get; set; }
    public ResponseStatus ResponseStatus { get; set; }
}
```

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/admin-ui-validation.md

---
title: Admin UI Validation
---

The DB Validation feature leverages the existing [Declarative Validation](/declarative-validation) infrastructure to enable dynamically managing Request DTO Type and Property Validators from an RDBMS data source. Changes immediately take effect at runtime and can be optionally cached, where they'll only need to be re-hydrated from the database after modification.

<lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="W5OJAlOxH98" style="background-image: url('https://img.youtube.com/vi/W5OJAlOxH98/maxresdefault.jpg')"></lite-youtube>

This feature can be easily added to existing host projects with:

:::sh
npx add-in validation-source
:::

Which will add the [Modular Startup](/modular-startup) validation configuration to your project, utilizing your existing configured database:

```csharp
[assembly: HostingStartup(typeof(MyApp.ConfigureValidation))]

namespace MyApp;

public class ConfigureValidation : IHostingStartup
{
    // Add support for dynamically generated db rules
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureServices(services => services.AddSingleton<IValidationSource>(c =>
            new OrmLiteValidationSource(c.Resolve<IDbConnectionFactory>(), HostContext.LocalCache)))
        .ConfigureAppHost(appHost => {
            // Create `ValidationRule` table if it doesn't exist in AppHost.Configure() or Modular Startup
            appHost.Resolve<IValidationSource>().InitSchema();
        });
}
```

Which the built-in [Validation Feature](/validation.html#validation-feature) detects before registering the `GetValidationRules` and `ModifyValidationRules` management APIs and enables the DB
Validation Admin UI:

<div class="block p-4 rounded shadow">
 <img src="/img/pages/admin-ui/validation-empty.png">
</div>

### Pre-populating Validation Rules

A minimum set of validation rules can be enforced by adding them on Startup, e.g:

```csharp
var validationSource = container.Resolve<IValidationSource>();
validationSource.InitSchema();
validationSource.SaveValidationRules(new List<ValidateRule> {
    new ValidateRule { Type=nameof(CreateTable), Validator = "NoRefTableReferences" },
    new ValidateRule { Type=nameof(MyRequest), Field=nameof(MyRequest.LastName), Validator = "NotNull" },
    new ValidateRule { Type=nameof(MyRequest), Field=nameof(MyRequest.Age), Validator = "InclusiveBetween(13,100)" },
});
```

This can also be used to support alternative data sources by pre-populating validation rules in a `MemoryValidationSource`, although the recommendation would be to implement [IValidationSourceAdmin](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack/src/ServiceStack.Interfaces/ValidationRule.cs) to get the full features of the Admin Validation UI.

### Validation UI

We can start adding validation rules after selecting the API we want to add them to. Tag groups provide a quick view popup allowing APIs to be selected with just a mouse, whilst groups with a large number of APIs can benefit from the Autocomplete textbox results filter.

<div class="block p-4 rounded shadow">
 <img src="/img/pages/admin-ui/validation-category.png">
</div>

The quick links help navigating between related AutoQuery APIs, allowing jumping between different APIs with the same Data Model.

In the validation editor you'll be able to create **Type** and **Property** Validation Rules that either make use of an existing **Validator** or you can enter a custom [#Script](https://sharpscript.net) expression that validates to `true`.
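As an illustrative sketch (the `MyRequest` DTO and its `Age` field are hypothetical), a custom `#Script` expression can also be saved programmatically by populating a rule's `Condition` instead of naming a `Validator`:

```csharp
// Sketch only: saving a rule with a custom #Script Condition alongside a built-in Validator.
// In a property rule's Condition, `it` refers to the property value being validated.
validationSource.SaveValidationRules(new List<ValidateRule> {
    // Property rule using a built-in validator
    new ValidateRule { Type=nameof(MyRequest), Field=nameof(MyRequest.Age),
        Validator = "InclusiveBetween(13,100)" },
    // Property rule using a custom #Script expression that must evaluate to true
    new ValidateRule { Type=nameof(MyRequest), Field=nameof(MyRequest.Age),
        Condition = "it.isOdd()" },
});
```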
The DB Validation UI is smart and will list all built-in and Custom Script Methods returning `ITypeValidator` or `IPropertyValidator` that are registered in the remote instance. The pre-defined list of validators is displayed in a list of "quick pick" buttons that enables fast adding/editing of validation rules.

### Verified Rules

The `ModifyValidationRules` API used to save validation rules performs a number of checks to ensure any Validation rules are accurate, including executing the validator to make sure it returns the appropriate validator type and checking the syntax on any **Script** validation rules to ensure they're valid.

<div class="block p-4 rounded shadow">
    <img src="/img/pages/admin-ui/validation-category-CategoryName.png">
</div>

<div class="mt-4 block p-4 rounded shadow">
    <img src="/img/pages/admin-ui/validation-category-Type.png">
</div>

The `ModifyValidationRules` back-end API also takes care of invalidating the validation rule cache so that any saved Validators are immediately applied.

Despite being sourced from a DB, after the first access the validation rules are cached in memory where they'd have a similar performance profile to validators declaratively added on Request DTOs in code.

After you add your validation rules they'll be immediately enforced when calling the API, e.g. in [API Explorer](/api-explorer) or [Locode](/locode/).

Be mindful of which Validation Rule you're adding to which DTO, e.g. a validation rule added to the **CreateCategory** API will **only be applied** when **creating** entities, e.g. not for full entity or partial field updates.
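Once cached, the DB rules have a similar performance profile to validators declared in code. For comparison, a sketch of how the same rules from the Startup example could instead be declared directly on the Request DTO with the declarative validation attributes (here `MyRequest` remains an illustrative DTO, not an API from this page):

```csharp
// Equivalent code-first declarative validators on the Request DTO.
// `MyRequest` and its properties are illustrative.
public class MyRequest
{
    [ValidateNotNull]
    public string LastName { get; set; }

    // The generic [Validate] form accepts the same validator expressions
    // used by the DB Validation rules above
    [Validate("InclusiveBetween(13,100)")]
    public int Age { get; set; }
}
```

Rules added via the Admin UI are merged with any attributes like these at runtime, so both sources of validators are enforced together.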
<div class="mt-4 block p-4 rounded shadow">
    <img src="/img/pages/admin-ui/validation-category-create.png">
</div>

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/admin-ui.md

---
slug: admin-ui
title: Admin UI
---

The Admin UI contains a suite of Admin tools providing a number of productivity features ranging from Managing Users and DB Validation Rules to gaining unprecedented insights into Live running Apps with in-depth Request Logging & Profiling where you'll be able to observe your App's behavior in real-time.

The [Admin UI](/admin-ui) is built into all ServiceStack Apps, accessible to [Admin Users](/debugging#admin-role) from:

<div class="not-prose text-center pb-3">
    <h3 class="text-4xl text-indigo-800">/admin-ui</h3>
</div>

Which will launch the Admin UI:

<analytics-ui class="not-prose"></analytics-ui>

### Dashboard

On first access you're greeted with the Admin dashboard showing high-level overview stats on the number and type of APIs your App has, as well as internal stats counters surfaced right on your Dashboard where they can provide valuable insights into the health of different features at a glance:

![](/img/pages/admin-ui/dashboard-features.png)

### Advertised features

As Admin is a capability-based UI it only shows the stats and features your App has enabled. To aid in discovery the dashboard now includes a light switch of available Admin features with a link to [Admin UI Feature Docs](/admin-ui-features), providing a summary of each Admin UI component and instructions on how to enable them.
### Admin UI Features

Explore the available Admin UIs to learn more about each of their capabilities:

### [Managing Users UI](/admin-ui-users)

Containing user management functionality for creating & modifying users, assigning Roles & Permissions, locking or updating passwords:

<a href="/admin-ui-identity-roles" class="not-prose">
    <div class="block p-4 rounded shadow hover:shadow-lg">
        <img src="/img/pages/admin-ui/identityauth-roles.webp">
    </div>
</a>

Quick start: Create a new [ASP.NET Identity Auth Template](https://servicestack.net/start).

### [Profiling & Logging UI](/admin-ui-profiling)

Enables invaluable observability into your App, from being able to quickly inspect and browse incoming requests, to tracing their behavior:

<a href="/admin-ui-profiling" class="not-prose">
    <div class="block p-4 rounded shadow hover:shadow-lg">
        <img src="/img/pages/admin-ui/admin-ui-logging.png">
    </div>
</a>

Enable Profiling:

:::copy
npx add-in profiling
:::

Enable RDBMS Request Logging:

:::copy
npx add-in db-requestlogs
:::

Enable SQLite Request Logging:

:::copy
npx add-in sqlitelogs
:::

### [Redis Admin](/admin-ui-redis)

Manage your App's configured Redis Server, query & edit core Redis data types and execute custom redis commands:

<a href="/admin-ui-redis" class="not-prose">
    <div class="block p-4 rounded shadow hover:shadow-lg">
        <img src="/img/pages/admin-ui/admin-ui-redis.png">
    </div>
</a>

Quick start:

:::copy
npx add-in redis
:::

### [Database Admin](/admin-ui-database)

Quickly browse and navigate your App's configured RDBMS schemas and tables:

<a href="/admin-ui-database" class="not-prose">
    <div class="block p-4 rounded shadow hover:shadow-lg">
        <img src="/img/pages/admin-ui/admin-ui-database.png">
    </div>
</a>

Quick start:

```csharp
services.AddPlugin(new AdminDatabaseFeature());
```

### [DB Validation UI](/admin-ui-validation)

Leverages the existing Declarative Validation infrastructure to enable dynamically managing Request DTO Type and Property Validators from an RDBMS data
source

<a href="/admin-ui-validation" class="not-prose">
    <div class="block p-4 rounded shadow hover:shadow-lg">
        <img src="/img/pages/admin-ui/admin-ui-validation.png">
    </div>
</a>

Quick start:

:::copy
npx add-in validation-source
:::

### [AI Chat UI](/ai-chat-analytics)

A unified API for integrating multiple local and cloud LLMs into your applications, providing financial visibility into your AI operations with interactive visualizations showing spending distribution across providers and models.

<a href="/ai-chat-analytics" class="not-prose">
    <div class="block p-4 rounded shadow hover:shadow-lg">
        <img src="/img/pages/ai-chat/admin-chat-costs.webp">
    </div>
</a>

Quick start:

:::copy
npx add-in chat
:::

### Feedback

The Admin UI was designed with room to grow. Please let us know what other features you would like to see on our [GitHub Discussions](https://github.com/ServiceStack/Discuss/discussions/2).

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/advanced-deployment-octopus-deploy.md

---
title: Advanced Deployments with Octopus Deploy
---

## Deploying and Installing SSL certificates with Octopus Deploy

There's a good chance at some stage you're going to want to have sensitive, environment-specific information that you don't want to store in source control, especially if your project is open source. This could be SSL certificates, license information or Production AppSettings like private database passwords, production connection strings or anything else you need to run your application.

One solution is to package all application settings and other dependencies on the build server. Using [TeamCity](http://www.jetbrains.com/teamcity/) and [Octopus Deploy](https://octopusdeploy.com/), packaging these settings up and deploying them along with the application is an easy-to-manage, flexible process that can streamline and automate even complicated application deployments.
Whilst the installation of SSL certificates, IIS bindings and other tasks here aren't time-consuming to do manually, being able to get a new server up and running quickly by automating as much as possible has many benefits, including consistent, replay-able deployments and not being tied to a particular infrastructure provider, allowing you to easily switch to better-value hosting providers like [Hetzner](http://www.hetzner.de/en/) should you wish to in the future.

## Getting started

This example is going to be an extension of the pattern used in the [Deploy multiple sites to a single AWS instance deployment guide](/deploy-multiple-sites-to-aws), but with an additional TeamCity step and a few extra process steps for Octopus Deploy to deploy an application with additional components and configuration.

1. **Settings** file read by ServiceStack application
2. ServiceStack **license file**
3. **SSL certificate** for HTTPS binding

As an overview, the TeamCity steps for building and deploying a **simple application** go something like:

### [TeamCity Steps](/deploy-multiple-sites-to-aws#teamcity-installation-and-setup)

#### Build application

1. Restore NuGet dependencies
2. Package application into .nupkg

#### Deploy application

1. Notify Octopus Deploy of new version available from TeamCity NuGet feed

With the added settings package, we are adding an additional step to the Build application list above.

1. **Package settings and other required environment files into .nupkg**
2. Restore NuGet dependencies
3. Package application into .nupkg

#### Creating the NuGet package

To package the files into a NuGet package we will need to create a **nuspec** file.
A .nuspec file is a [package specification](http://docs.nuget.org/docs/reference/nuspec-reference) that contains a list of files to include in the package, and this is where we have to specify the 3 files above, e.g:

```xml
<?xml version="1.0"?>
<package xmlns="http://schemas.microsoft.com/packaging/2010/07/nuspec.xsd">
  <metadata>
    <id>BenchmarkAnalyzerSettings</id>
    <version>1.0.0</version>
    <authors>ServiceStack</authors>
    <requireLicenseAcceptance>false</requireLicenseAcceptance>
    <description>Settings for the Benchmark Analyzer project</description>
  </metadata>
  <files>
    <file src="appsettings.license.txt" />
    <file src="appsettings.txt" />
    <file src="httpbenchmarks.pfx" />
  </files>
</package>
```

Once we have created a valid .nuspec file, we can get TeamCity to create a NuGet package using the **NuGet Pack runner type**. The main setting you will need for this configuration is the path where TeamCity can find the .nuspec file. If you are hosting your own TeamCity instance, this is just a local directory. However, if you are hosting a TeamCity instance separately, eg on a dedicated build server, you will need to copy these files local to the build server to package them or use another VCS that is secure for application settings. In this example, we will be packing the files from a known path on the build server.
::: info
If you are having issues with this step, check the file/folder permissions to the .nuspec and included files
:::

![](https://github.com/ServiceStack/Assets/raw/master/img/wikis/octopus-deploy-ssl/tc-nuget-pack.png)

**Output directory** is the same as the **Artifact paths** in the **General Settings** of the build configuration for building the application package:

![](https://github.com/ServiceStack/Assets/raw/master/img/wikis/octopus-deploy-ssl/tc-build-config-paths.png)

So now the build steps for our application build and packaging look like:

![](https://github.com/ServiceStack/Assets/raw/master/img/wikis/octopus-deploy-ssl/tc-build-config.png)

If everything is building correctly, you should get two **nupkg** files as artifacts from the build configuration, your application and settings package:

![](https://github.com/ServiceStack/Assets/raw/master/img/wikis/octopus-deploy-ssl/tc-project-output.png)

If you are using TeamCity's built-in NuGet server, these packages are published and available to use from Octopus Deploy.

## [Octopus Deploy process steps](/deploy-multiple-sites-to-aws#octopus-deploy-installation-and-initial-setup)

For a [simple application](/deploy-multiple-sites-to-aws#setting-up-octopus-deploy-projects) with no application settings or SSL certificate to install, we only have 2 steps for Octopus Deploy.

![](https://github.com/ServiceStack/Assets/raw/master/img/wikis/octopus-deploy-ssl/od-simple-example.png)

However, due to the need to automate the installation of all the project's dependencies, we have to add at least another 3 steps. We will need the following list of steps to happen before our newly published application is hosted with the right settings, license and SSL binding.

1. **Publish settings package**
2. **Install SSL if required**
3. Deploy Application Package
4. **Copy Production Settings and License**
5.
Update permissions

## Publish settings package

Publish settings is a "**Deploy Package**" process step looking at the TeamCity NuGet feed and using the new "BenchmarkAnalyzerSettings" package.

![Publish settings configuration](https://github.com/ServiceStack/Assets/raw/master/img/wikis/octopus-deploy-ssl/od-publish-settings.png)

> If you can't see the [NuGet feed from TeamCity](/deploy-multiple-sites-to-aws#octopus-deploy-installation-and-initial-setup), you need to [set up a new NuGet feed](http://docs.octopusdeploy.com/display/OD/Package+repositories) under Library in Octopus Deploy.

## Install SSL If Required

Next we will need to install the SSL certificate if it is not already installed. To do this, we can add some PowerShell within the Octopus Deploy Library under **Script Modules**. A script module can be included in a process step; this gives you the chance to reuse some of the more common PowerShell between steps or projects. In this example, we are going to write some PowerShell to help us **install the certificate**.

```powershell
function Add-SSLCertificateToIIS {
    param([string]$certPath,[string]$pfxPassword)
    Write-Output "Importing certificate into IIS from $certPath"
    $certMgr = New-Object -ComObject IIS.CertObj -ErrorAction SilentlyContinue
    $certMgr.ImportToCertStore($certPath,$pfxPassword,$true,$true)
}
```

This PowerShell function takes the path of a pfx file and a password, and installs the certificate into IIS to be used in an HTTPS binding. Once we've added this script to a new Script Module, we can **call the function from a process step**. If we add a new process step of "Install SSL If Required" and run another script that utilizes our new function, we can install the certificate that was packaged in our settings NuGet package.

![](https://github.com/ServiceStack/Assets/raw/master/img/wikis/octopus-deploy-ssl/od-ssl-step.png)

The script in the screenshot above uses a couple of Octopus Deploy variables, as well as results from the previous "Publish settings" step.
```powershell
Write-Output "Checking SSL certificate"
$cert = (Get-ChildItem -Path cert:\LocalMachine\My | ? { $_.subject -like "*cn=$CertCn*" })
if (!$cert) {
    Write-Output "SSL certificate missing, installing..."
    $certDirPath = $OctopusParameters['Octopus.Action[Publish settings].Output.Package.InstallationDirectoryPath']
    Add-SSLCertificateToIIS "$certDirPath\$CertFileName" $SSLKey
} else {
    Write-Output "SSL present, skipping installation..."
}
```

`$SSLKey`, `$CertFileName` and `$CertCn` are all **Octopus Deploy variables** we have declared in the project variables section; we access them just like any declared PowerShell object. The script itself simply checks the `LocalMachine\My` certificate store for a certificate that contains a specific string in the **subject** property; if no certificate matching the CN value is present, it installs the certificate from where it was published, using an encrypted password stored within Octopus Deploy. Generally, this will only install the certificate once on first install of the new virtual machine, but will reinstall it if for whatever reason it has to be manually removed.

## Copy Production Settings and License

The last new process step that is included is a straightforward copy from a source location to a destination. This step happens **after** the deployment of the application as the destination is not known until the deployment step has completed.

```powershell
$src = $OctopusParameters['Octopus.Action[Publish settings].Output.Package.InstallationDirectoryPath']
$dst = $OctopusParameters['Octopus.Action[Deploy Package].Output.Package.InstallationDirectoryPath']
Write-Output "Copying environment files from $src to $dst"
Get-ChildItem $src | Copy-Item -Destination $dst -Exclude "*.pfx"
```

The script also **excludes copying the pfx certificate** as this is not something we want to host in the IIS application. Another step that can be added is one to delete the SSL certificate after installation so this exclusion wouldn't be required.
Thankfully, templates for common tasks like deleting files are available to add to your Octopus Deploy instance from an [open source project](https://github.com/OctopusDeploy/Library). Here is an example of the File System – Clean Directory template installed and used to clean pfx files from the output directory.

![View of template once installed](https://github.com/ServiceStack/Assets/raw/master/img/wikis/octopus-deploy-ssl/od-clean-directory.png)

## Using the SSL binding

We can now set which binding to use in IIS in the Deploy Package step which deploys the main application. To use a specific SSL certificate you will need to provide the **SSL Thumbprint**. An easy way to find the Thumbprint of the certificate is to open the original **.crt** file, select the **Details** tab and select **Properties Only** from the **Show** drop down. Once you have this value, you will need to provide it in the Add binding menu when adding an IIS binding.

![Add HTTPS IIS binding in deployment step](https://github.com/ServiceStack/Assets/raw/master/img/wikis/octopus-deploy-ssl/od-add-binding-https.png)

This thumbprint associates the binding with the newly installed certificate. Once installed, the certificate will show up in the IIS Management window under Server Certificates and the specified website will have the HTTPS binding associated with the installed certificate.

![IIS Manager after deployment](https://github.com/ServiceStack/Assets/raw/master/img/wikis/octopus-deploy-ssl/iis-certificates.png)

## HTTP binding redirect

So far we have set up all we need for the HTTPS binding, but since the default protocol when navigating to a site directly is HTTP, we want to set up a URL rewrite so that any requested resource on HTTP is redirected to HTTPS. To do this we will need to add some configuration in the Web.Config.Release transformation.
```xml
<system.webServer>
  <rewrite xdt:Transform="Insert">
    <rules>
      <rule name="Redirect to HTTPS" stopProcessing="true">
        <match url="(.*)" />
        <conditions>
          <add input="{HTTPS}" pattern="^OFF$" />
        </conditions>
        <action type="Redirect" url="https://{HTTP_HOST}/{R:1}" redirectType="SeeOther" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

This configuration uses an IIS module that may or may not be installed on your server. It can be installed via the [Web Platform Installer](http://www.microsoft.com/web/downloads/platform.aspx) or installed separately, see the [IIS site for more details](http://www.iis.net/downloads/microsoft/url-rewrite).

We will **also need an HTTP binding** so the redirects can take place. IIS configuration in Octopus Deploy allows multiple IIS bindings for a deployed package. To add an HTTP binding, it is the same as the step above, but selecting HTTP from the Protocol drop down.

### Managing settings files yourself vs Octopus Deploy variables

In this example we are storing the application's SSL certificate (.pfx file) for use with IIS, application settings and license in text files we are managing ourselves. To import a certificate and install it onto a new virtual machine we can't get around the need to deploy the pfx file. However, there are many ways to handle application settings, including the ServiceStack license. Octopus Deploy has some really useful features when it comes to handling application settings, even environment-specific passwords etc. You could use this for storing the ServiceStack license itself as well as database credentials and other [sensitive information](http://docs.octopusdeploy.com/display/OD/Security+and+encryption).
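However the settings are deployed, the application ultimately reads them through ServiceStack's `AppSettings` API. A minimal sketch of consuming substituted `appSettings` values, where the key names (`ConnectionString`, `TimeoutSecs`, `EnabledFeatures`) are illustrative rather than from this example:

```csharp
// Reads values from Web.config <appSettings>, which Octopus Deploy
// can substitute per-environment at deploy time. Key names are illustrative.
var appSettings = new AppSettings();

var connectionString = appSettings.GetString("ConnectionString");
var timeoutSecs = appSettings.Get("TimeoutSecs", 30);      // typed, with default
var features = appSettings.GetList("EnabledFeatures");     // e.g. "Csv,Html"
```

Reading settings through this one abstraction means the same code works whether the values came from a deployed settings file, web.config transforms or Octopus Deploy variable substitution.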
![Octopus Deploy Variables](https://github.com/ServiceStack/Assets/raw/master/img/wikis/octopus-deploy-ssl/od-variables.png)

Although not used in this example, Octopus Deploy variables can replace web.config `appSettings` values, which can then feed into [ServiceStack's AppSettings and its more useful data-structures](/appsettings#example-usage). To enable this functionality, remember to enable it in your application's deploy step.

![](https://github.com/ServiceStack/Assets/raw/master/img/wikis/octopus-deploy-ssl/od-replace-app-settings.png)

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/locode/advanced.md

---
title: Advanced Locode Features
---

## Pre-populated Reference Data

As we look for ways to improve productivity, Locode now pre-populates referential data from navigated references, e.g. when creating a new Job Application after navigating to a [Job's Applications in Talent Blazor](https://talent.locode.dev/locode/QueryJob) it uses this context to pre-populate the Job it's filtered by:

![](/img/pages/locode/prepopulated-related-data.png)

## Support for Large Apps

The built-in capability-based UIs are powered by your APIs' metadata. As more of our Customers started to make use of these new UIs in their workflow, we've had reports from some with [Large Apps (550+ APIs)](https://forums.servicestack.net/t/api-explorer-hangs-on-large-service-layer/10743) that the UIs started to hang their browsers when processing the **9.5MB** of generated metadata.
To support Larger Apps we've added the ability to restrict the metadata and UIs to only related APIs in [user-defined Tag Groups](/api-design#group-services-by-tag) by adding `?IncludeTypes` to the URL, e.g:

- `/ui/?IncludeTypes={tag}`

This follows the Include Types pattern where you can view multiple tag groups with:

- `/ui/?IncludeTypes={tag1},{tag2}`

This feature is supported in all built-in UIs and is now integrated on the **/metadata** page where if you select a tag the API Explorer link will be updated with **?IncludeTypes={tag}**:

![](/img/pages/locode/locode-tags-filter.png)

Where you'll now be able to open API Explorer restricted to APIs with that tag without needing to manually craft the URL.

## Localize Metadata

To assist with creating Localized Locode UIs, all user-defined descriptive text is now routed through to your AppHost's `ResolveLocalizedString()` method which you can use to return a localized string for the current request, e.g:

```csharp
public override string ResolveLocalizedString(string text, IRequest request = null)
{
    return request != null
        ? MyResolveLocalizedString(text, request.Headers[HttpHeaders.AcceptLanguage])
        : text;
}
```

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/advantages-of-message-based-web-services.md

---
slug: advantages-of-message-based-web-services
title: Advantages of message-based WebServices
---

### This is in response to a recent question from [mailing group](https://groups.google.com/forum/?fromgroups#!topic/servicestack/qkV5fzdnzt8):

> It seems like ServiceStack is designed for use primarily in a greenfield SOA implementation where the technology environment is quite homogeneous, and more or less the same people have ownership of the servers and all the clients. Is that correct?

ServiceStack's message-based design is optimal for the design of **any remote service**.
We believe .NET has never gotten web services right, which was the inspiration for starting ServiceStack. If .NET was lucky enough to have had someone like Martin Fowler (or just someone following his decade-old guidance) at the helm of the Microsoft Patterns & Practices and VS.NET tools teams from the start, we would've likely been able to avoid the multiple replacement web service frameworks from Microsoft that .NET web service developers have endured over the years - and still haven't got right. This is unfortunate considering remote services are the most important APIs developers can create, as they ultimately offer the highest level of software re-use possible whilst remaining programmatically composable.

## Best-practices for remote services

Inspiration should've ideally been taken from Martin Fowler or from companies that have [SOA](http://en.wikipedia.org/wiki/Service-oriented_architecture) ingrained in their DNA, who have successfully run long-term evolving SOA solutions that have enabled rich and complex platforms. Amazon is a shining example of this, where even Steve Yegge admits [it's the only thing they do better than Google](https://gigaom.com/2011/10/12/419-the-biggest-thing-amazon-got-right-the-platform/). Their relentless dedication to exposing SOA services over all their systems has enabled their industry-leading EC2 and [aws.amazon.com](http://aws.amazon.com/) cloud services.
## The Service API of Amazon's Web Services

If you look at an [example of Amazon's EC2 Web Service APIs](http://docs.amazonwebservices.com/AWSEC2/latest/APIReference/ApiReference-query-AttachVolume.html) you'll see a strong similarity with ServiceStack's approach where they accept a Request message and return a Response message for all their services, e.g:

### Example Request

```
https://ec2.amazonaws.com/?Action=AttachVolume
&VolumeId=vol-4d826724
&InstanceId=i-6058a509
&Device=/dev/sdh
&AUTHPARAMS
```

### Example Response

```xml
<AttachVolumeResponse xmlns="http://ec2.amazonaws.com/doc/2012-06-01/">
  <requestId>59dbff89-35bd-4eac-99ed-be587EXAMPLE</requestId>
  <volumeId>vol-4d826724</volumeId>
  <instanceId>i-6058a509</instanceId>
  <device>/dev/sdh</device>
  <status>attaching</status>
  <attachTime>2008-05-07T11:51:50.000Z</attachTime>
</AttachVolumeResponse>
```

From this we can attest Amazon maintains a Request [DTO](http://en.wikipedia.org/wiki/Data_Transfer_Object) named `AttachVolume` and a Response DTO named `AttachVolumeResponse`. This is the same design ServiceStack encourages, and with some minor customisations the Request can easily be made more REST-ful with:

```
POST https://ec2.amazonaws.com/volumes/vol-4d826724/attach
FormData: InstanceId=i-6058a509&Device=/dev/sdh&AUTHPARAMS
```

In terms of accessibility and interoperability, ServiceStack 1-ups Amazon here since the same Request DTO can be populated with any combination of a [Custom Route, QueryString, FormData or a POST'ed XML, JSON, JSV, SOAP 1.1/1.2 or ProtoBuf Request DTO payload](/architecture-overview). Although this is really inconsequential since both Amazon and ServiceStack also provide **typed service clients** so you're never required to manually construct the request by hand.
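The Amazon contract above maps naturally onto a ServiceStack Request/Response DTO pair. A sketch of how it might be declared, assuming the more REST-ful custom route shown earlier (the property names simply mirror Amazon's fields and aren't part of any real ServiceStack API):

```csharp
// Request DTO: the route mirrors the REST-ful form shown above
[Route("/volumes/{VolumeId}/attach", "POST")]
public class AttachVolume : IReturn<AttachVolumeResponse>
{
    public string VolumeId { get; set; }
    public string InstanceId { get; set; }
    public string Device { get; set; }
}

// Response DTO: properties mirror Amazon's response fields
public class AttachVolumeResponse
{
    public string RequestId { get; set; }
    public string VolumeId { get; set; }
    public string InstanceId { get; set; }
    public string Device { get; set; }
    public string Status { get; set; }
    public DateTime AttachTime { get; set; }
}
```

The `IReturn<AttachVolumeResponse>` marker is what lets ServiceStack's generic service clients call the API in a single typed statement without any code-gen.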
## Messaging at Google

Although Amazon holds the SOA edge, Google, like Amazon, also benefits from message-based design for nearly all their internal communications, using their own Data Interchange Format - [Protocol Buffers](http://code.google.com/p/protobuf/), which like JSON is fast, compact and tolerant:

::: info
Protocol Buffers are a way of encoding structured data in an efficient yet extensible format. Google uses Protocol Buffers for almost all of its internal RPC protocols and file formats
:::

A simple DSL is used to define their Protocol Buffer message DTOs:

```js
message Person {
  required int32 id = 1;
  required string name = 2;
  optional string email = 3;
}
```

From this they use their `protoc` command-line utility to generate native types in C++, Java and Python, which like Amazon and ServiceStack, enables them to benefit from using an end-to-end, typed API.

> Add a reference to [ServiceStack.ProtoBuf](http://nuget.org/packages/ServiceStack.ProtoBuf) to enable [@marcgravell's](http://stackoverflow.com/users/23354/marc-gravell) excellent implementation of Protocol Buffers: [protobuf-net](https://github.com/mgravell/protobuf-net).

## A Productivity Win

We've been building SOA systems like this with ServiceStack for years: the productivity boost you get from using a typed, end-to-end, resilient, message-based API is unmatched. An example of this was having developed over 100+ Web Services for the [Redis WebServices](http://redisadminui.servicestack.net/redis/metadata) project in just 1 weekend. Others that have tried ServiceStack also agree the ease-of-development and **"pit of success"** design it promotes ultimately yields a productivity win - this positive sentiment is captured in [@ServiceStack's favorites](https://twitter.com/ServiceStack/likes) and throughout the [mailing group](https://groups.google.com/forum/?fromgroups#!forum/servicestack).

### The anatomy of a ServiceStack service

For normal services, ServiceStack is an invisible library, i.e.
it lets you implement your service in pure, untainted C#, accepting any user-defined Request and letting you return any Response DTO without any regard to endpoints and formats. Any dependencies your services need can be declared as public properties and are automatically auto-wired on each request. A complete example of this is the self-contained [Backbone TODO backend persisted using Redis](https://github.com/ServiceStack/ServiceStack.Examples/blob/master/src/Backbone.Todos/Global.asax.cs).

### [In contrast with SOAP](http://www.infoq.com/articles/interview-servicestack)

By contrast we've witnessed .NET devs struggling to implement far fewer SOAP web services within the same timeframe, especially when they're fighting unexpected and unknown WCF interoperability issues. Not only is SOAP more verbose and slower, it's less tolerant and versionable; it was never a good choice for the open web and is now effectively deprecated.

## The many webservice frameworks of Microsoft

Unfortunately, despite Microsoft having hosted Martin Fowler's respected [Data Transfer Object](http://msdn.microsoft.com/en-us/library/ff649585.aspx) and [Service Gateway](http://msdn.microsoft.com/en-us/library/ff650101.aspx) patterns on MSDN for years - none of their web frameworks have encouraged their use. Instead in .NET we've been forced to code against the multiple generations of replacement web service frameworks they've churned out over the years like .asmx, CSF, WCF, WCF/REST, WSE, WCF DataServices, RIA, MVC (escaping earlier cruft) and now WebApi. Each of these frameworks shares the same mistake of mapping to C# methods, which we believe is a terrible idea for network services since it promotes chatty and brittle remote interfaces that fail to facilitate the easy creation of SOA-like APIs. Throughout all these generations of frameworks ServiceStack's underlying core message-based design has remained a constant, powerful primitive that drives much of its simplicity.
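The auto-wired public properties mentioned above mean a Service only declares what it needs. A minimal sketch, where `IGreetingRepository`, `Hello` and `HelloResponse` are illustrative types rather than anything from this article:

```csharp
// Illustrative dependency + DTOs
public interface IGreetingRepository { string GreetingFor(string name); }
public class Hello { public string Name { get; set; } }
public class HelloResponse { public string Result { get; set; } }

public class MyService : Service
{
    // Public properties are auto-wired from the IOC on each request
    public IGreetingRepository Repository { get; set; }

    public object Any(Hello request) =>
        new HelloResponse { Result = Repository.GreetingFor(request.Name) };
}
```

Because the dependency is just a property on a plain class, the same service can be unit tested by assigning a stub `IGreetingRepository` directly, with no HTTP context required.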
At a minimum ServiceStack Services just need to implement the empty marker interface:

```csharp
public interface IService { }
```

Which lets you handle any HTTP Verb, as well as a 'Catch All' **Any** fall-back to handle any un-specified HTTP verbs, e.g:

```csharp
public class MyService : IService
{
    public Response Get(Request request) => ...;
    public Response Post(Request request) => ...;

    //Fallback for Anything else e.g DELETE, PUT, PATCH, OPTIONS, etc.
    public Response Any(Request request) => ...;
}
```

It simply accepts any user-defined Request DTO and returns any Response DTO - that you're given complete freedom to create. If more customization or headers are ever needed you can return the decorated response inside a `HttpResult` or `HttpError` to maintain full control over the HTTP output.

### Message APIs minimize round-trips, creating fewer, more re-usable and extensible services

Message APIs are naturally batchful and promote the development of coarse-grained service interfaces. This encourages fewer, more re-usable services that are better positioned for extensibility - this is a key benefit, since well-defined (i.e. non RPC/client-specific) back-end services tend to outlive the UIs and clients that consume them. This is illustrated in this example between the [different style of services that WCF and ServiceStack encourage](https://gist.github.com/1386381).

Another example showcasing the differences is in many of jQuery's APIs that take in an array of key/value pairs, like [$.ajax()](http://api.jquery.com/jQuery.ajax/). Imagine if every configuration permutation was a different or overloaded method? This gets unwieldy, very quickly. A coarse-grained interface enables richer functionality in a single call, whilst sharing the same well-tested code-path.

### Code-first POCO's

Since it promotes clean, re-usable code, ServiceStack has always encouraged the use of code-first [POCO](http://en.wikipedia.org/wiki/Plain_Old_CLR_Object)'s for just about everything.
i.e. the same POCO can be used:

- In Request and Response DTO's (on client and server)
- In [JSON, JSV and CSV Text Serializers](/formats)
- As the data model in [OrmLite](/ormlite/), [db4o](http://code.google.com/p/servicestack/source/browse/#svn%2Ftrunk%2FCommon%2FServiceStack.DataAccess%2FServiceStack.DataAccess.Db4oProvider) and [NHibernate](http://code.google.com/p/servicestack/source/browse/#svn%2Ftrunk%2FCommon%2FServiceStack.DataAccess%2FServiceStack.DataAccess.NHibernateProvider%253Fstate%253Dclosed)
- As the entities stored in [Redis](/redis/)
- As blobs stored in [Caches](/caching) and [Sessions](/auth/sessions)
- Dropped and executed in [MQ's services](/redis-mq)
- Dehydrating complex configurations into

Leveraging different technologies whose functionality is built around POCO's offers unprecedented levels of re-use, reduces friction and promotes consistent, more usable and easier-to-reason-about code-bases.

## The case against mapping to method signatures

The programmer convenience and familiarity of using method signatures for Service APIs is just not worth what you give up: it encourages developers to treat web services as just another method call, even though they're millions of times slower, and the different properties between service endpoints and C# methods highlight more of the shortcomings of this approach. In order to evolve your services without friction, services should be both forward and backwards compatible - not fail-fast. When you evolve and refactor a C# method you have the opportunity to refactor all the call sites to meet the new method signature - whereas in a disconnected (and already deployed) client/server solution you don't: you need to support both old and new clients' requests, and the only way to do this cleanly whilst still maintaining the same code-path is to pass messages between them.
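To make the compatibility point concrete, here's a minimal sketch - using ServiceStack.Text's serializer and hypothetical `GetOrders` DTOs (not from this document) - of how a message contract can evolve while already-deployed clients keep working:

```csharp
using ServiceStack.Text;

// v1 Request DTO that already-deployed clients serialize
public class GetOrders
{
    public int Page { get; set; }
}

// v2 evolves the same contract by only adding optional properties
public class GetOrdersV2
{
    public int Page { get; set; }
    public int? PageSize { get; set; }        // new, optional
    public string StatusFilter { get; set; }  // new, optional
}

public static class VersioningDemo
{
    public static void Main()
    {
        // JSON emitted by an old v1 client
        var v1Json = JsonSerializer.SerializeToString(new GetOrders { Page = 2 });

        // The evolved v2 service still understands the old payload: missing
        // properties just deserialize to their defaults instead of failing fast
        var request = JsonSerializer.DeserializeFromString<GetOrdersV2>(v1Json);
        // request.Page == 2, request.PageSize == null
    }
}
```

Because the message is the contract, old and new clients can share the same code-path on the server without a breaking redeploy.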
But the main disadvantage of method signature service APIs is that they mandate the use of code-gen in order to provide a typed client API. Using messages allows you to re-use generic service clients for all your service communications. This is how, even up to this day, ServiceStack remains the only .NET framework to maintain terse (both sync and async), typed, end-to-end client libraries without any code-gen, e.g:

```cs
Todo createdTodo = client.Post(new Todo { Content = "New Todo", Order = 1 });
```

### Code-gen'ing service clients is evil

Although the subject of another post, we consider code-gen an arthritis that imposes undue friction on a project: it adds unnecessary build steps, increases compile times, forces lock-step deployment of client/server endpoints (usually requiring downtime), inhibits DRY/code re-use amongst code-gen types, has your domain logic bound to external moving types (that are outside of your control), and is less resilient since code-gen types parse the entire payload - so unwanted breakages can occur on changes to unused parts of the request or response.

The only way to maintain a succinct API that maps to method signatures without code-gen is to use dynamic. We're a fan of dynamic languages for most (fuzzy) development tasks like creating/binding UIs, html generation, scripting, etc. But we don't think they're optimal for creating evolving service APIs - which stand to benefit most from statically typed annotations, compiler warnings and the refactoring support that statically typed languages can provide.

### Twitter benefitting from Typed Services

After rewriting their Ruby APIs in Scala, Twitter also came to the same conclusion: http://www.infoq.com/articles/twitter-java-use

> I would say about half of the productivity gain is purely because of accumulated technical debt in the search Rails stack.
> And the other half is that, as search has moved into a Service Oriented Architecture and exposes various APIs, static typing becomes a big convenience in enforcing coherency across all the systems. You can guarantee that your dataflow is more or less going to work, and focus on the functional aspects. Whereas for something like building a web page you don't want to recompile all the time, you don't really want to worry about whether in some edge condition you are going to get a type you didn't expect. But as we move into a light-weight Service Oriented Architecture model, static typing becomes a genuine productivity boon. And Scala gives you the same thing.

## Advantages of Message-based designs

So in contrast to method signatures, message-based designs offer many advantages:

- They're easy to version and evolve, since you're freely able to add/remove functionality and properties without error
- They're easy to route, chain and decorate through to different handlers and pipelines
- They're easy to serialize and proxy through to remote services
- They're easy to record, defer and replay - evident by ServiceStack's [IMessaging API](/redis-mq) which can automatically drop one-way services into MQ's and have them executed inside an MQ Host. You get this functionality for free since the MQ host can re-use the same web service implementation. It works the other way too, where you can supply a Url in the ReplyTo property to have the MQ response POST'ed to a ServiceStack HTTP Service.
- They're easy to log, evident by ServiceStack's trivially simple but useful [IRequestLogger service](/request-logger)
- They're easy to map and translate to and from domain models using convention and auto mappers
- Ideal for concurrency, as immutable messages are thread-safe and can be easily multiplexed with their handlers over multiple threads.
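As a concrete sketch of the record/defer/replay point, the same Service implementation that handles HTTP requests can be registered with an MQ host. This sketch assumes a Redis-backed MQ host, a local Redis instance and a hypothetical `Hello` Request DTO - adapt to your own AppHost:

```csharp
// In your AppHost's Configure(): register an MQ host that re-uses
// the same Service implementation that serves HTTP requests
var redisFactory = new PooledRedisClientManager("localhost:6379");
var mqServer = new RedisMqServer(redisFactory, retryCount: 2);

// Messages published to the Hello MQ are executed by the same
// HelloService that handles /hello HTTP requests
mqServer.RegisterHandler<Hello>(base.ExecuteMessage);
mqServer.Start();
```

Nothing about `HelloService` itself changes - it remains a pure C# class accepting a `Hello` message, which is what makes the HTTP and MQ endpoints interchangeable.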
All these properties make messages a better choice for distributed systems, where all this functionality can be achieved with generic solutions, since it's much easier to pass around and serialize messages than method signature invocations.

<lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="Vae0ALalIP0" style="background-image: url('https://img.youtube.com/vi/Vae0ALalIP0/maxresdefault.jpg')"></lite-youtube>

### Message designs are well known

None of these qualities are new: message-based designs have been very well known for decades and are used in many of the most highly respected real-time distributed technologies, including Erlang processes, Java's JMS, Scala's Actors, Go's go-routines, Dart's isolates and F#'s Mailboxes, as well as in industrial strength Message Queues which (amongst many other benefits) provide deferred, durable and reliable execution - and are a strong force in most other platforms.

### Messaging practically non-existent in .NET

They're just weak in .NET since Microsoft has an unnaturally strong influence over .NET developer mindshare and has been able to steer mainstream .NET into using SOAP Web Services at the expense of MQ's and message-based solutions. Our guess for this is that they've only had a poor, outdated MQ option in MSMQ, and that the workflow of 1) right-click 'Add Service Reference' and 2) call a remote service like a C# method demos very well. This could change with the advent of [Azure Service Bus](https://azure.microsoft.com/en-us/services/service-bus/) if Microsoft devotes some attention into pointing mainstream .NET devs towards a messaging approach.
Although with the imminent release of WebApi and Microsoft's full marketing force, army of employees and full-time developer advocates behind it, they will likely once again be successful in moving most of mainstream .NET onto yet another replacement web-service framework for all their remote communication needs - when often message-based / SOA designs provide a better fit.

For those interested in discovering the advantages of MQ's and message-based designs and their abilities in enabling loosely coupled distributed systems, we recommend the excellent book from Martin Fowler's signature series - [Enterprise Integration Patterns](http://www.eaipatterns.com/).

## So is ServiceStack only suitable for homogeneous environments that control client / server?

This is fairly inaccurate considering ServiceStack's mission is close to the exact opposite: i.e. to encapsulate and empower your services logic and host it in the most accessible and re-usable ways possible on the most popular endpoints and formats. This is, after all, the core objective of a service: to expose high-level functionality in the most accessible way and have it consumed with the least effort possible.

### Excellent built-in support of HTTP in most open and popular formats

Not only do all ServiceStack services accept a Request DTO populated with any combination of Custom Routes, QueryString and HTML FormData, you can also POST the serialized Request DTO in any of the built-in formats: i.e. XML, JSON, JSV, SOAP 1.1/1.2, ProtoBuf (when enabled) or your own [custom format](https://northwind.netcore.io/vcard-format.htm). All services immediately support JSONP and it's trivial to [enable CORS on all services](http://stackoverflow.com/questions/8211930/servicestack-rest-api-and-cors). Should you wish, you're also able to have all HTTP Verbs execute the same service.
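For example, the CORS support mentioned above is enabled by registering ServiceStack's `CorsFeature` plugin in your AppHost - a minimal sketch, where the parameter values shown are illustrative rather than required (the plugin has sensible defaults):

```csharp
public override void Configure(Container container)
{
    // Allow browser-based clients from any origin to call your services
    Plugins.Add(new CorsFeature(
        allowedOrigins: "*",
        allowedMethods: "GET, POST, PUT, DELETE, OPTIONS",
        allowedHeaders: "Content-Type"));
}
```

Once registered, the appropriate `Access-Control-*` headers are emitted on all service responses.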
Support is included for registering raw custom IHttpHandler's, [Request / Response Filters](/request-and-response-filters) and HttpResult/HttpError results - giving you a multitude of options to maintain full control over the entire HTTP output, should you need to meet any extraneous external requirements.

### Text serializers are fast, resilient and idiomatic

ServiceStack's JSON & JSV serializers are **case-insensitive** (i.e. they support both camelCase and PascalCase properties) and the 1-liner below (already included in most Example templates) emits idiomatic camelCase JSON output:

```cs
JsConfig.Init(new Config { TextCase = TextCase.CamelCase });
```

They're both [very resilient and can withstand extreme versioning without error](./redis/schemaless-migration.md), making it easy to consume 3rd party APIs.

### Your services can be consumed by more than just HTTP

Your services implementation can even be re-used inside any [IMessageService hosts](/redis-mq), which at this time includes support for [Background MQ, Rabbit MQ, Redis MQ, AWS SQS and Azure Service Bus MQ Servers](/messaging). For maximum performance, Services can also be consumed from [high-performance HTTP/2 gRPC endpoints](/grpc/) and its universe of [protoc Generated Clients](https://grpc.servicestack.net/).

All these features make ServiceStack one of the most versatile and flexible web service frameworks in existence - making your services accessible in a myriad of different use-cases.

### Most examples don't actually have .NET clients

This is also a peculiar assumption in light of the fact that most [ServiceStack Examples](https://github.com/ServiceStack/ServiceStack.Examples/) (as seen on http://servicestack.net) are actually Single Page Apps being consumed with Ajax clients (i.e. no .NET clients in sight).
The [Backbone TODOs example](http://www.servicestack.net/Backbone.Todos/) shows how trivial it was to retrofit Backbone's REST service with a ServiceStack backend, and the [SocialBootstrapApi](https://github.com/ServiceStack/SocialBootstrapApi) example showcases an internet-ready Single Page Backbone App with Facebook, Twitter, HTML Form credentials, Basic Auth & Digest Auth all enabled. It also makes use of ServiceStack's [cross-platform node.js bundler](https://github.com/ServiceStack/Bundler) for statically compiling, minifying and concatenating your website's .coffee, .js, .less, .sass and .css assets - and since it works headless and without .NET, it can be used for non-.NET projects as well.

### .NET clients can benefit from a typed API

.NET clients do benefit from being able to re-use the same types you've defined your web services with, which in addition to [pre-defined auto routes](http://www.servicestack.net/ServiceStack.Hello/#predefinedroutes) and generic service clients, is what enables the typed, end-to-end client gateways.

### Just as consumable as any other web service

But even without this, ServiceStack services are just as consumable as any other web service framework, since they're just pure DTOs serialized into the preferred Content-Type, with no additional bytes or wrappers added to the response.

Different client examples in contrast - the earlier typed API example of creating a new TODO in C#:

```csharp
var client = new JsonApiClient(baseUrl);
Todo createdTodo = client.Post(new Todo { Content = "New Todo", Order = 1 });
```

Is like this in [TypeScript](/typescript-add-servicestack-reference):

```ts
var client = new JsonApiClient(baseUrl);
var request = new Todo();
request.Content = "New Todo";
request.Order = 1;
client.post(request)
    .then((createdTodo) => ...)
```

and this in Dart (using the [Dart JSON Client](https://github.com/mythz/DartJsonClient)):

```dart
var client = new JsonClient(baseUrl);
client.todos({'content':'New Todo', 'order':1})
    .then((createdTodo) => ...);
```

Or in jQuery:

```js
$.post(baseUrl + '/todos',
    { content: 'New Todo', order: 1 },
    function (createdTodo) { ... }, 'json');
```

And you still have the option to consume all services in other Content-Types. Some languages may prefer to deal with XML - which can easily be accessed by adding the appropriate `Accept` and `Content-Type` headers.

## What's the best way to expose our services to clients today?

### What was SOAP's original vision?

Back in the W3C glory days there existed think-tanks who imagined a beautiful world where you could easily discover and connect to services through the magical properties of UDDI and WSDLs. However this pipe-dream never came to pass; the closest .NET got was the 'Add Service Reference' dialog ingrained in VS.NET - which we refer to internally as the technical-debt-creating-anti-pattern dialog :)

### So how should we expose our services now?

The natural choice would be to just document your services' uniform HTTP interface using XML or JSON REST APIs. This is a popular choice for many companies today; some good examples include [StackOverflow](http://api.stackoverflow.com/1.0/usage) and [GitHub](http://developer.github.com/v3/). This approach easily supports dynamic clients who are able to trivially consume JSON responses.

### Native language client libraries

A more productive option for clients, however, would be to provide a native client library for each of the popular languages you wish to support. This is generally the approach of companies who really, really want to help you use their services - they tend to call their clients **SDK's**, which abstract away the underlying connection transport behind optimal native language bindings.
This is especially evident from companies whose business relies on the popular use of their APIs, like [Amazon](http://aws.amazon.com/sdkfornet/), [Facebook](https://developers.facebook.com/docs/sdks/) and [Windows Azure](https://github.com/WindowsAzure).

This is an especially good idea if you want to support static languages (i.e. C# and Java), where having typed client libraries saves end-users from reverse engineering the types and API calls. It also saves them having to look up documentation, since a lot of it can be inferred from the type info. ServiceStack's and Amazon's convention of having `ServiceName` and `ServiceNameResponse` for each service also saves users from continually checking documentation to work out what the response of each service will be.

### Packaging client libraries

In terms of packaging your client libraries, sticking a link to a zip file on your website's API documentation page would be the easiest approach. It would be better if the zip file was a link to a master archive of a GitHub repository, as you'll be able to accept bug fixes and usability tips from the community.

Finally, we believe the best way to make your client libraries available is to host them in each target language's native package manager - letting end-users issue one command to automatically add it to their project, and another to easily update it when your service has changed.

### NuGet is the new Add Service Reference

For .NET this means adding it to NuGet. If you use ServiceStack, your package would just need to contain your types with a reference to [ServiceStack.Client](http://nuget.org/packages/ServiceStack.Client) - which contains all ServiceStack's generic JSON, XML, JSV and SOAP 1.1/1.2 service clients. Add a reference to [ServiceStack.ProtoBuf](http://nuget.org/packages/ServiceStack.ProtoBuf) if you want to support Protocol Buffers as well.
One of the benefits of using ServiceStack is that all your types are already created, since it's what you used to define your web services with!

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/ai-chat-analytics.md

---
title: Admin UI Analytics for AI Chat
---

ServiceStack's [AI Chat](/ai-chat-api) feature provides a unified API for integrating multiple AI providers into your applications. To gain visibility into usage patterns, costs, and performance across your AI infrastructure, the platform includes comprehensive chat history persistence and analytics capabilities.

:::copy
npx add-in chat
:::

Or by referencing the **ServiceStack.AI.Chat** NuGet package and adding the `ChatFeature` plugin:

```csharp
services.AddPlugin(new ChatFeature {
    EnableProviders = [
        "servicestack",
    ]
});
```

## AI Chat History Persistence

Enabling chat history persistence allows you to maintain a complete audit trail of all AI interactions, track token consumption, monitor costs across providers and models, and analyze usage patterns over time. It captures every request and response flowing through AI Chat's UI, external OpenAI endpoints and internal `IChatStore` requests.

### Database Storage Options

ServiceStack provides two storage implementations to suit different deployment scenarios:

`DbChatStore` - A universal solution that stores chat history in a single table compatible with any RDBMS [supported by OrmLite](/ormlite/getting-started):

```csharp
services.AddSingleton<IChatStore, DbChatStore>();
```

`PostgresChatStore` - An optimized implementation for PostgreSQL that leverages monthly table partitioning for improved query performance and data management:

```csharp
services.AddSingleton<IChatStore, PostgresChatStore>();
```

Both implementations utilize indexed queries with result limits to ensure consistent performance even as your chat history grows.
The partitioned approach in PostgreSQL offers additional benefits for long-term data retention and archival strategies.

## Admin UI Analytics

Once chat history persistence is enabled, the Admin UI provides comprehensive analytics dashboards that deliver actionable insights into your AI infrastructure. The analytics interface offers multiple views to help you understand costs, optimize token usage, and monitor activity patterns across all configured AI providers and models.

The analytics dashboard includes three primary tabs:

- **Cost Analysis** - Track spending across providers and models with daily and monthly breakdowns
- **Token Usage** - Monitor input and output token consumption to identify optimization opportunities
- **Activity** - Review detailed request logs with full conversation history and metadata

These visualizations enable data-driven decisions about provider selection, model usage, and cost optimization strategies.

### Cost Analysis

The Cost Analysis tab provides financial visibility into your AI operations with interactive visualizations showing spending distribution across providers and models. Daily cost trends help identify usage spikes, while monthly aggregations reveal long-term patterns. Pie charts break down costs by individual models and providers, making it easy to identify your most expensive AI resources and opportunities for cost optimization.

:::{.wideshot}
![](/img/pages/ai-chat/admin-chat-costs.webp)
:::

### Token Usage

The Token Usage tab tracks both input (prompt) and output (completion) tokens across all requests. Daily usage charts display token consumption trends over time, while model and provider breakdowns show which AI resources consume the most tokens. This granular visibility helps optimize prompt engineering, identify inefficient usage patterns, and forecast capacity requirements.
:::{.wideshot}
![](/img/pages/ai-chat/admin-chat-tokens.webp)
:::

### Activity Log

The Activity tab maintains a searchable log of all AI chat requests, displaying timestamps, models, providers, and associated costs. Clicking any request opens a detailed view showing the complete conversation including user prompts, AI responses, token counts, duration, and the full request payload. This audit trail is invaluable for debugging, quality assurance, and understanding how your AI features are being used in production.

:::{.wideshot}
![](/img/pages/ai-chat/admin-chat-activity.webp)
:::

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/ai-chat-api.md

---
title: AI Chat
---

**AI Chat** is our refreshingly simple solution for integrating AI into your applications by unlocking the full value of the OpenAI Chat API. Unlike most other OpenAI SDKs and Frameworks, all of AI Chat's features are centered around arguably the most important API of our time - OpenAI's simple [Chat Completion API](https://platform.openai.com/docs/api-reference/chat), i.e. the primary API used to access Large Language Models (LLMs).
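For reference, the Chat Completion API everything here is built around is just a simple JSON request/response pair. The minimal OpenAI wire format for a question looks like:

```json
{
  "model": "gpt-5",
  "messages": [
    { "role": "user", "content": "Capital of France?" }
  ]
}
```

AI Chat's typed `ChatCompletion` DTO is a direct, strongly-typed projection of this shape.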
## Install

AI Chat can be added to any .NET 10+ project by installing the **ServiceStack.AI.Chat** NuGet package and configuring it with:

:::copy
npx add-in chat
:::

Which drops this simple [Modular Startup](/modular-startup) that adds the `ChatFeature` and registers a link to its UI on the [Metadata Page](/metadata-page) if you want it:

```csharp
public class ConfigureAiChat : IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureServices(services => {
            services.AddPlugin(new ChatFeature());
            services.ConfigurePlugin<MetadataFeature>(feature => {
                feature.AddPluginLink("/chat", "AI Chat");
            });
        });
}
```

#### Prerequisites:

As AI Chat protects its APIs and UI with Identity Auth or API Keys, you'll need to enable the [API Keys Feature](/auth/apikeys) if you haven't already:

:::sh
npx add-in apikeys
:::

## Single Powerful API

Your App logic needs only bind to a simple `IChatClient` interface that accepts a typed `ChatCompletion` Request DTO and returns a typed `ChatResponse` DTO:

```csharp
public interface IChatClient
{
    Task<ChatResponse> ChatAsync(
        ChatCompletion request, CancellationToken token=default);
}
```

An implementation-free, easily substitutable interface for calling any OpenAI-compatible Chat API, using clean typed `ChatCompletion` and `ChatResponse` DTOs. Unfortunately, since the API needs to be typed and .NET serializers don't yet support de/serializing union types, the DTO adopts OpenAI's more verbose and flexible multi-part Content Type, which looks like:

```csharp
IChatClient client = CreateClient();

var request = new ChatCompletion {
    Model = "gpt-5",
    Messages = [
        new() {
            Role = "user",
            Content = [
                new AiTextContent {
                    Type = "text",
                    Text = "Capital of France?"
                }
            ],
        }
    ]
};
var response = await client.ChatAsync(request);
```

To improve the UX we've added a [Message.cs](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack/src/ServiceStack.AI.Chat/Message.cs) helper which encapsulates the boilerplate of sending **Text**, **Image**, **Audio** and **Files** into more succinct and readable code, where you'd typically only need to write:

```csharp
var request = new ChatCompletion {
    Model = "gpt-5",
    Messages = [
        Message.SystemPrompt("You are a helpful assistant"),
        Message.Text("Capital of France?"),
    ]
};
var response = await client.ChatAsync(request);
string? answer = response.GetAnswer();
```

### Same ChatCompletion DTO, Used Everywhere

That's all that's required for your internal App logic to access your App's configured AI Models. However, as AI Chat also makes its own OpenAI-compatible API available, your external .NET clients can use the **same exact DTO** to get the **same Response** by calling your API with a [C# Service Client](/csharp-client):

```csharp
var client = new JsonApiClient(BaseUrl) {
    BearerToken = apiKey
};
var response = await client.SendAsync(request);
```

### Support for Text, Images, Audio & Files

For multi-modal LLMs which support it, you can also send Images, Audio & File attachments with your AI Request using **URLs**, e.g:

```csharp
var image = new ChatCompletion {
    Model = "qwen2.5vl",
    Messages = [
        Message.Image(imageUrl:"https://example.org/image.webp",
            text:"Describe the key features of the input image"),
    ]
};

var audio = new ChatCompletion {
    Model = "gpt-4o-audio-preview",
    Messages = [
        Message.Audio(data:"https://example.org/speaker.mp3",
            text:"Please transcribe and summarize this audio file"),
    ]
};

var file = new ChatCompletion {
    Model = "gemini-flash-latest",
    Messages = [
        Message.File(fileData:"https://example.org/order.pdf",
            text:"Please summarize this document"),
    ]
};
```

#### Relative File Path

If a [VirtualFiles Provider](/virtual-file-system) was configured, you can specify a
relative path instead:

```csharp
var image = new ChatCompletion {
    Model = "qwen2.5vl",
    Messages = [
        Message.Image(imageUrl:"/path/to/image.webp",
            text:"Describe the key features of the input image"),
    ]
};
```

#### Manual Download & Embedding

Alternatively you can embed and send the raw Base64 Data or Data URI yourself:

```csharp
var bytes = await "https://example.org/image.webp".GetBytesFromUrlAsync();
var dataUri = $"data:image/webp;base64,{Convert.ToBase64String(bytes)}";

var image = new ChatCompletion {
    Model = "qwen2.5vl",
    Messages = [
        Message.Image(imageUrl:dataUri,
            text:"Describe the key features of the input image"),
    ]
};
```

Sending references to external resources keeps AI Request payloads small, making them easier to store in databases, send in MQs, use in client workflows, etc. This also illustrates one of the "value-added" features of AI Chat: it will automatically download any URL resources and embed them as Base64 Data in the `ChatCompletion` Request DTO.

### Configure Downloads

Relative paths can be enabled by configuring a `VirtualFiles` provider to refer to a safe path that you want to allow access to. URLs are downloaded by default, but this behavior can be customized with `ValidateUrl` or replaced entirely with `DownloadUrlAsBase64Async`:

```csharp
services.AddPlugin(new ChatFeature {
    // Enable Relative Path Downloads
    VirtualFiles = new FileSystemVirtualFiles(assetDir),

    // Validate URLs before download
    ValidateUrl = url => {
        if (!IsAllowedUrl(url))
            throw HttpError.Forbidden("URL not allowed");
    },

    // Use Custom URL Downloader
    // DownloadUrlAsBase64Async = async (provider, url) => {
    //     var (base64, mimeType) = await MyDownloadAsync(url);
    //     return (base64, mimeType);
    // },
});
```

## Configure AI Providers

By default AI Chat is configured with a list of providers in its `llms.json`, which is pre-configured with the best models from the leading LLM providers.
The easiest way to use a custom `llms.json` is to add a locally modified copy of [llms.json](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack/src/ServiceStack.AI.Chat/chat/llms.json) to your App's `/wwwroot/chat` folder:

```files
/wwwroot
  /chat
    llms.json
```

If you just need to change which providers are enabled you can specify them in `EnableProviders`:

```csharp
services.AddPlugin(new ChatFeature {
    // Specify which providers you want to enable
    EnableProviders = [
        "groq",
        "google_free",
        "codestral",
        "openrouter_free",
        "ollama",
        "google",
        "anthropic",
        "openai",
        "grok",
        "qwen",
        "z.ai",
        "mistral",
        "openrouter",
        "servicestack",
    ],
    // Use custom llms.json configuration
    ConfigJson = vfs.GetFile("App_Data/llms.json").ReadAllText(),
});
```

Alternatively you can use `ConfigJson` to load a custom JSON provider configuration from a different source, which you'll want to use if you prefer to keep your provider configuration and API Keys all in `llms.json`.

### llms.json - OpenAI Provider Configuration

[llms.json](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack/src/ServiceStack.AI.Chat/chat/llms.json) contains the list of OpenAI-compatible providers you want to make available, along with a user-defined **model alias** to use for model routing and the provider-specific model name it maps to when the model is used with that provider, e.g:

```json
{
  "providers": {
    "openrouter": {
      "enabled": false,
      "type": "OpenAiProvider",
      "base_url": "https://openrouter.ai/api",
      "api_key": "$OPENROUTER_API_KEY",
      "models": {
        "grok-4": "x-ai/grok-4",
        "glm-4.5-air": "z-ai/glm-4.5-air",
        "kimi-k2": "moonshotai/kimi-k2",
        "deepseek-v3.1:671b": "deepseek/deepseek-chat",
        "llama4:400b": "meta-llama/llama-4-maverick"
      }
    },
    "anthropic": {
      "enabled": false,
      "type": "OpenAiProvider",
      "base_url": "https://api.anthropic.com",
      "api_key": "$ANTHROPIC_API_KEY",
      "models": {
        "claude-sonnet-4-0": "claude-sonnet-4-0"
      }
    },
    "ollama": {
      "enabled": false,
      "type":
"OllamaProvider", "base_url": "http://localhost:11434", "models": {}, "all_models": true }, "google": { "enabled": false, "type": "GoogleProvider", "api_key": "$GOOGLE_API_KEY", "models": { "gemini-flash-latest": "gemini-flash-latest", "gemini-flash-lite-latest": "gemini-flash-lite-latest", "gemini-2.5-pro": "gemini-2.5-pro", "gemini-2.5-flash": "gemini-2.5-flash", "gemini-2.5-flash-lite": "gemini-2.5-flash-lite" }, "safety_settings": [ { "category": "HARM_CATEGORY_DANGEROUS_CONTENT", "threshold": "BLOCK_ONLY_HIGH" } ], "thinking_config": { "thinkingBudget": 1024, "includeThoughts": true } }, //... } } ``` The only non-OpenAI Chat Provider AI Chat supports is `GoogleProvider`, where an exception was made to add explicit support for Gemini's Models given its low cost and generous free quotas. ### Provider API Keys API Keys can be either be specified within the `llms.json` itself, alternatively API Keys starting with `$` like `$GOOGLE_API_KEY` will first try to resolve it from `Variables` before falling back to checking Environment Variables. ```csharp services.AddPlugin(new ChatFeature { EnableProviders = [ "openrouter", "anthropic", "google", ], Variables = { ["OPENROUTER_API_KEY"] = secrets.OPENROUTER_API_KEY, ["ANTHROPIC_API_KEY"] = secrets.ANTHROPIC_API_KEY, ["GOOGLE_API_KEY"] = secrets.GOOGLE_API_KEY, } }); ``` ### Model Routing and Failover Providers are invoked in the order they're defined in `llms.json` that supports the requested model. If a provider fails, it tries the next available provider. This enables scenarios like: - Routing different request types to different providers - Optimize by Cost, Performance, Reliability, or Privacy - A/B testing different models - Added resilience with fallback when a provider is unavailable The model aliases don't need to identify a model directly, e.g. 
you could use your own artificial names for the use-cases you need, like `image-captioner`, `audio-transcriber` or `pdf-extractor`, then map them to the different models each provider should use to achieve the desired task.

#### Use Model Routing with Fallback

To make use of model routing and fallback, call `ChatAsync` on `IChatClient` directly:

```csharp
class MyService(IChatClient client)
{
    public async Task<object> Any(DefaultChat request)
    {
        return await client.ChatAsync(new ChatCompletion {
            Model = "glm-4.6",
            Messages = [
                Message.Text(request.UserPrompt)
            ],
        });
    }
}
```

#### Use Specific Provider

Alternatively, to use a specific provider you can use the `IChatClients` dependency's `GetClient(providerId)` method to resolve it, after which calling `ChatAsync` will only use that provider:

```csharp
class MyService(IChatClients clients)
{
    public async Task<object> Any(ProviderChat request)
    {
        var groq = clients.GetClient("groq");
        return await groq.ChatAsync(new ChatCompletion {
            Model = "kimi-k2",
            Messages = [
                Message.Text(request.UserPrompt)
            ],
        });
    }
}
```

### Compatible with llms.py

The other benefit of simple configuration and simple solutions is that they're easy to implement - a perfect example being that this is the 2nd implementation built on this configuration. The same configuration, UI, APIs and functionality is also available in [llms.py](https://github.com/ServiceStack/llms), the Python CLI and server gateway we developed in order to have a dependency-free LLM Gateway solution for our ComfyUI Agents:

:::sh
pip install llms-py
:::

This also means you can use and test your own custom `llms.json` configuration on the command-line or in shell automation scripts:

```sh
# Simple question
llms "Explain quantum computing"

# With specific model
llms -m gemini-2.5-pro "Write a Python function to sort a list"

# With system prompt
llms -s "You are a helpful coding assistant" "Reverse a string in Python?"
# With image (vision models) llms --image image.jpg "What's in this image?" llms --image https://example.com/photo.png "Describe this photo" # Display full JSON Response llms "Explain quantum computing" --raw # Start the UI and an OpenAI compatible API on port 8000: llms --serve 8000 ``` Incidentally as [llms.py UI](https://servicestack.net/posts/llms-py-ui) and AI Chat utilize the same UI you can use its **import/export** features to transfer your AI Chat History between them. Checkout the [llms.py GitHub repo](https://github.com/ServiceStack/llms) for even more features. ## FREE Gemini, Minimax M2, GLM 4.6, Kimi K2 in AI Chat To give AI Chat instant utility, we're making available a free `servicestack` OpenAI Chat provider that can be enabled with: ```csharp services.AddPlugin(new ChatFeature { EnableProviders = [ "servicestack", // "groq", // "google_free", // "openrouter_free", // "ollama", // "google", // "anthropic", // "openai", // "grok", // "qwen", // "z.ai", // "mistral", // "openrouter", ] }); ``` The `servicestack` provider is configured with a default `llms.json` which enables access to Gemini and the best value OSS models for FREE: ```json { "providers": { "servicestack": { "enabled": false, "type": "OpenAiProvider", "base_url": "http://okai.servicestack.com", "api_key": "$SERVICESTACK_LICENSE", "models": { "gemini-flash-latest": "gemini-flash-latest", "gemini-flash-lite-latest": "gemini-flash-lite-latest", "kimi-k2": "kimi-k2", "kimi-k2-thinking": "kimi-k2-thinking", "minimax-m2": "minimax-m2", "glm-4.6": "glm-4.6", "gpt-oss:20b": "gpt-oss:20b", "gpt-oss:120b": "gpt-oss:120b", "llama4:400b": "llama4:400b", "mistral-small3.2:24b": "mistral-small3.2:24b" } } } } ``` The `servicestack` provider requires the `SERVICESTACK_LICENSE` Environment Variable, although any ServiceStack License Key can be used, including expired and Free ones. 
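The ordered provider routing with failover described earlier can be illustrated with a small language-agnostic sketch (shown here in Python). The provider shape and `chat` callable are hypothetical stand-ins for illustration, not AI Chat's actual internals:

```python
def chat_with_failover(providers, request):
    """Try each enabled provider that supports the requested model,
    in the order providers are defined (as in llms.json)."""
    model = request["model"]
    last_error = None
    for provider in providers:
        if not provider["enabled"] or model not in provider["models"]:
            continue
        try:
            return provider["chat"](request)  # stand-in for the provider API call
        except Exception as e:
            last_error = e  # provider failed: fall through to the next one
    raise last_error or LookupError(f"No enabled provider supports '{model}'")
```

With this shape, a request for a model served by two providers is transparently retried on the second when the first raises an error.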
:::{.not-prose}
:::{.my-8 .max-w-3xl .mx-auto .rounded-lg .overflow-hidden .shadow .hover:shadow-xl}
[![](/img/pages/ai-chat/llms-syntax.webp)](/ai-chat-ui)
:::
:::

### FREE for Personal Usage

To be able to maintain this as a free service for development, personal assistance and research, usage is limited to **60 requests /hour**, which should be more than enough for most personal usage and research whilst deterring use in automated tools or production.

:::tip info
Rate limiting is implemented with a sliding [Token Bucket algorithm](https://en.wikipedia.org/wiki/Token_bucket) that replenishes 1 additional request every 60s
:::

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/ai-chat-custom-explorer-ui.md

---
title: Custom Explorer UI for Chat
---

The UX differences between [API Explorer](/api-explorer) and Swagger UI become more pronounced as APIs get larger and more complex, which we can see by comparing how each renders [AI Chat's](/ai-chat-api) `ChatCompletion` API, starting with Swagger UI:

[![](/img/pages/ai-chat/ai-chat-swagger-form.webp)](/img/pages/ai-chat/ai-chat-swagger-form.webp)

The [full-length Swagger UI Screenshot](/img/pages/ai-chat/ai-chat-swagger-long.webp) shows that it's far from being a usable UI that you would want to present to your API Consumers.

As expected from a generic UI we get very little assistance on what values are allowed, the numeric fields aren't number inputs and the only dropdowns we see are for `bool` properties to select from their `true` and `false` values. Nor does it stand any chance of showing App-specific options like which models are currently enabled.
## API Explorer UI

By contrast here is the same API rendered with ServiceStack's [API Explorer](/api-explorer):

[![](/img/pages/ai-chat/ai-chat-form.webp)](/img/pages/ai-chat/ai-chat-form.webp)

This is much closer to what you'd expect from a hand-crafted Application UI and far more usable.

#### Properties use optimized UI Components

It renders an optimized UI for each property, with the **Model**, **Reasoning Effort**, **Service Tier** and **Verbosity** properties all using a [Combobox](/vue/combobox) component for quickly searching through a list of supported options, or they can choose to enter a custom value.

**Bool** properties use Checkboxes whilst Numeric fields use **number** inputs, with integer properties only allowing integer values and floating point properties able to step through fractional values.

#### UI-specific text hints

Each property also contains **placeholder** text and **help** text hints that are more focused and concise than the verbose API documentation.

#### HTML client-side validation

Client-side HTML validation ensures properties are valid and within any configured min/max values before any request is sent.

[![](/img/pages/ai-chat/ai-chat-form-completed.webp)](/img/pages/ai-chat/ai-chat-form-completed.webp)

### Custom Components for Complex Properties

The only property that doesn't use a built-in component is `Messages`, which is rendered with a custom `ChatMessages` component purpose-built to populate the `List<AiMessage> Messages` property. It uses a **Markdown Editor** for the UserPrompt, a collapsible Textarea for any System Prompt and the ability to attach **image**, **audio** & **file** document attachments to the API request.

## How is it done?
The entire UI is driven by these [declarative annotations](/locode/declarative) added on the [ChatCompletion](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack/src/ServiceStack.AI.Chat/ChatCompletion.cs) Request DTO: ```csharp [Description("Chat Completions API (OpenAI-Compatible)")] [Notes("The industry-standard, message-based interface for interfacing with Large Language Models.")] public class ChatCompletion : IPost, IReturn<ChatResponse> { [DataMember(Name = "messages")] [Input(Type = "ChatMessages", Label=""), FieldCss(Field = "col-span-12")] public List<AiMessage> Messages { get; set; } = []; [DataMember(Name = "model")] [Input(Type = "combobox", EvalAllowableValues = "Chat.Models", Placeholder = "e.g. glm-4.6", Help = "ID of the model to use")] public string Model { get; set; } [DataMember(Name = "reasoning_effort")] [Input(Type="combobox", EvalAllowableValues = "['low','medium','high','none','default']", Help = "Constrains effort on reasoning for reasoning models")] public string? ReasoningEffort { get; set; } [DataMember(Name = "service_tier")] [Input(Type = "combobox", EvalAllowableValues = "['auto','default']", Help = "Processing type for serving the request")] public string? ServiceTier { get; set; } [DataMember(Name = "safety_identifier")] [Input(Type = "text", Placeholder = "e.g. user-id", Help = "Stable identifier to help detect policy violations")] public string? SafetyIdentifier { get; set; } [DataMember(Name = "stop")] [Input(Type = "tag", Max = "4", Help = "Up to 4 sequences for the API to stop generating tokens")] public List<string>? Stop { get; set; } [DataMember(Name = "modalities")] [Input(Type = "tag", Max = "3", Help = "The output types you would like the model to generate")] public List<string>? Modalities { get; set; } [DataMember(Name = "prompt_cache_key")] [Input(Type = "text", Placeholder = "e.g. my-cache-key", Help = "Used by OpenAI to cache responses for similar requests")] public string? 
PromptCacheKey { get; set; } [DataMember(Name = "tools")] public List<Tool>? Tools { get; set; } [DataMember(Name = "verbosity")] [Input(Type = "combobox", EvalAllowableValues = "['low','medium','high']", Placeholder = "e.g. low", Help = "Constrains verbosity of model's response")] public string? Verbosity { get; set; } [DataMember(Name = "temperature")] [Input(Type = "number", Step = "0.1", Min = "0", Max = "2", Placeholder = "e.g. 0.7", Help = "Higher values more random, lower for more focus")] public double? Temperature { get; set; } [DataMember(Name = "max_completion_tokens")] [Input(Type = "number", Value = "2048", Step = "1", Min = "1", Placeholder = "e.g. 2048", Help = "Max tokens for completion (inc. reasoning tokens)")] public int? MaxCompletionTokens { get; set; } [DataMember(Name = "top_logprobs")] [Input(Type = "number", Step = "1", Min = "0", Max = "20", Placeholder = "e.g. 5", Help = "Number of most likely tokens to return with log probs")] public int? TopLogprobs { get; set; } [DataMember(Name = "top_p")] [Input(Type = "number", Step = "0.1", Min = "0", Max = "1", Placeholder = "e.g. 0.5", Help = "Nucleus sampling - alternative to temperature")] public double? TopP { get; set; } [DataMember(Name = "frequency_penalty")] [Input(Type = "number", Step = "0.1", Min = "0", Max = "2", Placeholder = "e.g. 0.5", Help = "Penalize tokens based on frequency in text")] public double? FrequencyPenalty { get; set; } [DataMember(Name = "presence_penalty")] [Input(Type = "number", Step = "0.1", Min = "0", Max = "2", Placeholder = "e.g. 0.5", Help = "Penalize tokens based on presence in text")] public double? PresencePenalty { get; set; } [DataMember(Name = "seed")] [Input(Type = "number", Placeholder = "e.g. 42", Help = "For deterministic sampling")] public int? Seed { get; set; } [DataMember(Name = "n")] [Input(Type = "number", Placeholder = "e.g. 1", Help = "How many chat choices to generate for each input message")] public int? 
N { get; set; }

    [Input(Type = "checkbox", Help = "Whether or not to store the output of this chat request")]
    [DataMember(Name = "store")]
    public bool? Store { get; set; }

    [DataMember(Name = "logprobs")]
    [Input(Type = "checkbox", Help = "Whether to return log probabilities of the output tokens")]
    public bool? Logprobs { get; set; }

    [DataMember(Name = "parallel_tool_calls")]
    [Input(Type = "checkbox", Help = "Enable parallel function calling during tool use")]
    public bool? ParallelToolCalls { get; set; }

    [DataMember(Name = "enable_thinking")]
    [Input(Type = "checkbox", Help = "Enable thinking mode for some Qwen providers")]
    public bool? EnableThinking { get; set; }

    [DataMember(Name = "stream")]
    [Input(Type = "hidden")]
    public bool? Stream { get; set; }
}
```

Which uses the [[Input] attribute](/locode/declarative#custom-fields-and-inputs) to control the HTML Input rendered for each property, whose `Type` can reference any HTML Input or any [ServiceStack Vue Component](/vue/form-inputs) that's either built-in or registered with the Component library.

In addition, you also have control over the CSS of the containing **Field**, **Input** and **Label** elements with the [[FieldCss] attribute](/locode/declarative#field), where `[FieldCss(Field="col-span-12")]` is used to render the field spanning the full width of the form.

The `[Input(Type="hidden")]` is used to hide the `Stream` property since streaming isn't applicable in an API Explorer UI.

### Combobox Values

The Combobox `EvalAllowableValues` can reference any JavaScript expression which is evaluated with [#Script](https://sharpscript.net), with the results embedded in the API Metadata that API Explorer uses to render its UI.
All comboboxes reference a static JS Array except for `Model`, which uses `EvalAllowableValues = "Chat.Models"` to invoke the registered `Chat` instance's `Models` property that returns an ordered list of all available models from all enabled providers:

```csharp
appHost.ScriptContext.Args[nameof(Chat)] = new Chat(this);

public class Chat(ChatFeature feature)
{
    public List<string> Models => feature.Providers.Values
        .SelectMany(x => x.Models.Keys)
        .Distinct()
        .OrderBy(x => x)
        .ToList();
}
```

### Custom ChatMessages Component

The only property that doesn't use a built-in component is:

```csharp
[Input(Type = "ChatMessages", Label=""), FieldCss(Field = "col-span-12")]
public List<AiMessage> Messages { get; set; } = [];
```

Which makes use of a custom `ChatMessages` component in [/modules/ui/components/ChatMessages.mjs](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack/src/ServiceStack.AI.Chat/modules/ui/components/ChatMessages.mjs).

Custom Components can be added to API Explorer in the same way as [overriding any built-in API Explorer](/locode/custom-overview#ui) component, by adding it to your local `/wwwroot` folder:

```files
/modules
  /ui
    /components
      ChatMessages.mjs
```

All components added to the `/components` folder will be automatically registered and available for use.

That's all that's needed to customize the `ChatCompletion` Form UI in API Explorer. For more features and customizations see the [API Explorer Docs](/api-explorer).

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/ai-chat-ui.md

---
title: AI Chat UI
---

A major value proposition of [AI Chat](/ai-chat-api) is being able to offer a ChatGPT-like UI to your users where you're able to control the API Keys, billing, and sanctioned providers your users can access to maintain your own **Fast, Local, and Private** access to AI from within your own organization.
### Identity Auth or Valid API Key

AI Chat makes use of ServiceStack's new [API Keys or Identity Auth APIs](https://servicestack.net/posts/apikey_auth_apis) which allows usage by Authenticated Identity Auth users, whilst unauthenticated users will need to provide a valid API Key:

:::{.shadow}
[![](/img/pages/ai-chat/ai-chat-ui-apikey.webp)](/img/pages/ai-chat/ai-chat-ui-apikey.webp)
:::

If needed, `ValidateRequest` can be used to further restrict access to AI Chat's UI and APIs, e.g. you can restrict access to API Keys with the `Admin` scope with:

```csharp
services.AddPlugin(new ChatFeature {
    ValidateRequest = async req =>
        req.GetApiKey()?.HasScope(RoleNames.Admin) == true
            ? null
            : HttpResult.Redirect("/admin-ui"),
});
```

### Import / Export

All data is stored locally in the user's browser IndexedDB. When needed you can backup and transfer your entire chat history between different browsers using the **Export** and **Import** features on the home page.

:::{.wideshot}
[![llms-home.webp](/img/pages/ai-chat/llms-home.webp)](/img/pages/ai-chat/llms-home.webp)
:::

## Simple and Flexible UI

Like all of [ServiceStack's built-in UIs](https://servicestack.net/auto-ui), AI Chat is also [naturally customizable](/locode/custom-overview) where you can replace any of [AI Chat's Vue Components](https://github.com/ServiceStack/ServiceStack/tree/main/ServiceStack/src/ServiceStack.AI.Chat/chat) with your own by placing them in your [/wwwroot/chat](https://github.com/ServiceStack/ServiceStack/tree/main/ServiceStack/tests/AdhocNew/wwwroot/chat) folder:

```files
/wwwroot
  /chat
    Brand.mjs
    Welcome.mjs
```

Where you'll be able to customize the appearance and behavior of AI Chat's UI to match your App's branding and needs.
:::{.wideshot}
[![](/img/pages/ai-chat/ai-chat-custom-ui.webp)](/img/pages/ai-chat/ai-chat-custom-ui.webp)
:::

## Customize

The built-in [ui.json](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack/src/ServiceStack.AI.Chat/chat/ui.json) configuration can be overridden with your own to use your preferred system prompts and other defaults by adding them to your local folder:

```files
/wwwroot
  /chat
    llms.json
    ui.json
```

Alternatively `ConfigJson` and `UiConfigJson` can be used to load custom JSON configuration from a different source, e.g:

```csharp
services.AddPlugin(new ChatFeature {
    // Use custom llms.json configuration
    ConfigJson = vfs.GetFile("App_Data/llms.json").ReadAllText(),

    // Use custom ui.json configuration
    UiConfigJson = vfs.GetFile("App_Data/ui.json").ReadAllText(),
});
```

## Rich Markdown & Syntax Highlighting

To maximize readability there's full support for Markdown and Syntax highlighting for the most popular programming languages.

:::{.wideshot}
[![llms-syntax.webp](/img/pages/ai-chat/llms-syntax.webp)](/img/pages/ai-chat/llms-syntax.webp)
:::

To quickly and easily make use of AI Responses, **Copy Code** icons are readily available on hover of all messages and code blocks.

## Rich, Multimodal Inputs

The Chat UI goes beyond just text and can take advantage of the multimodal capabilities of modern LLMs with support for Image, Audio, and File inputs.

### 🖼️ 1. Image Inputs & Analysis

Images can be uploaded directly into your conversations with vision-capable models for comprehensive image analysis.

Visual AI Responses are highly dependent on the model used. This is a typical example of the visual analysis provided by the latest Gemini Flash of our [ServiceStack Logo](/img/logo.png):

:::{.wideshot}
[![llms-image.webp](/img/pages/ai-chat/llms-image.webp)](/img/pages/ai-chat/llms-image.webp)
:::

### 🎤 2. Audio Input & Transcription

Likewise you can upload Audio files and have them transcribed and analyzed by multi-modal models with audio capabilities.

:::{.wideshot}
[![llms-audio.webp](/img/pages/ai-chat/llms-audio.webp)](/img/pages/ai-chat/llms-audio.webp)
:::

Audio files can be uploaded with system and user prompts instructing the model to transcribe and summarize their content, with the model's multi-modal capabilities integrated right within the chat interface.

### 📎 3. File and PDF Attachments

In addition to images and audio, you can also upload documents, PDFs, and other files to capable models to extract insights and summarize or analyze their content.

**Document Processing Use Cases:**

- **PDF Analysis**: Upload PDF documents for content extraction and analysis
- **Data Extraction**: Extract specific information from structured documents
- **Document Summarization**: Get concise summaries of lengthy documents
- **Query Content**: Ask questions about specific content in documents
- **Batch Processing**: Upload multiple files for comparative analysis

Perfect for research, document review, data analysis, and content extraction.
:::{.wideshot}
[![llms-files.webp](/img/pages/ai-chat/llms-files.webp)](/img/pages/ai-chat/llms-files.webp)
:::

## Custom AI Chat Requests

Send Custom Chat Completion requests through the settings dialog, allowing Users to fine-tune their AI requests with advanced options including:

- **Temperature** `(0-2)` for controlling response randomness
- **Max Completion Tokens** to limit response length
- **Seed** values for deterministic sampling
- **Top P** `(0-1)` for nucleus sampling
- **Frequency** & **Presence Penalty** `(-2.0 to 2.0)` for reducing repetition
- **Stop** Sequences to control where the API stops generating
- **Reasoning Effort** constraints for reasoning models
- **Top Logprobs** `(0-20)` for token probability analysis
- **Verbosity** settings

:::{.wideshot}
[![llms-settings.webp](/img/pages/ai-chat/llms-settings.webp)](/img/pages/ai-chat/llms-settings.webp)
:::

## Enable / Disable Providers

**Admin** Users can manage which providers they want enabled or disabled at runtime. Providers that support the requested model are invoked in the order they're defined in `llms.json`. If a provider fails, it tries the next available one.

By default `llms.json` defines providers with Free tiers first, followed by local providers and then premium cloud providers, which can all be enabled or disabled from the UI:

:::{.wideshot}
[![llms-providers.webp](/img/pages/ai-chat/llms-providers.webp)](/img/pages/ai-chat/llms-providers.webp)
:::

## Search History

Quickly find past conversations with built-in search:

:::{.wideshot}
[![llms-search-python.webp](/img/pages/ai-chat/llms-search-python.webp)](/img/pages/ai-chat/llms-search-python.webp)
:::

## Smart Autocomplete for Models & System Prompts

Autocomplete components are used to quickly find and select the preferred model and system prompt. Only models from enabled providers will appear in the drop down, which will be available immediately after providers are enabled.
:::{.wideshot}
[![llms-autocomplete.webp](/img/pages/ai-chat/llms-autocomplete.webp)](/img/pages/ai-chat/llms-autocomplete.webp)
:::

## Comprehensive System Prompt Library

Access a curated collection of 200+ professional system prompts designed for various use cases, from technical assistance to creative writing.

:::{.wideshot}
[![llms-system-prompt.webp](/img/pages/ai-chat/llms-system-prompt.webp)](/img/pages/ai-chat/llms-system-prompt.webp)
:::

System Prompts can be added, removed & sorted in your `ui.json`:

```json
{
  "prompts": [
    {
      "id": "it-expert",
      "name": "Act as an IT Expert",
      "value": "I want you to act as an IT expert. You will be responsible..."
    },
    ...
  ]
}
```

### Reasoning

Access the thinking process of advanced AI models with specialized rendering for reasoning and chain-of-thought responses:

:::{.wideshot}
[![llms-reasoning.webp](/img/pages/ai-chat/llms-reasoning.webp)](/img/pages/ai-chat/llms-reasoning.webp)
:::

We're excited to get AI Chat into customers' hands. Please [let us know](https://servicestack.net/ideas) of any other missing features you'd love to see implemented.
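The numeric ranges listed under Custom AI Chat Requests lend themselves to simple client-side validation. A minimal, language-agnostic sketch (shown in Python), with the range table transcribed from that list; this illustrates the idea rather than AI Chat's actual implementation:

```python
# Documented ranges for the advanced request options (from the list above).
RANGES = {
    "temperature":       (0.0, 2.0),
    "top_p":             (0.0, 1.0),
    "frequency_penalty": (-2.0, 2.0),
    "presence_penalty":  (-2.0, 2.0),
    "top_logprobs":      (0, 20),
}

def validate_options(options):
    """Return a dict of field -> error message for out-of-range values."""
    errors = {}
    for field, value in options.items():
        if field in RANGES and value is not None:
            lo, hi = RANGES[field]
            if not (lo <= value <= hi):
                errors[field] = f"{field} must be between {lo} and {hi}"
    return errors
```

An empty result means all supplied values fall within their documented ranges.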
--- # Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/vue/alerts.md --- title: Alert Components group: Component Gallery --- <api-reference component="Alert"></api-reference> ## Alert <p class="mb-4 text-lg"> Show basic alert message: </p> ```html <Alert>Default <b>Message</b></Alert> <Alert type="info">Information <b>Message</b></Alert> <Alert type="success">Success <b>Message</b></Alert> <Alert type="warn">Warning <b>Message</b></Alert> <Alert type="error">Error <b>Message</b></Alert> ``` <div class="not-prose mb-4"> <alert>Default <b>Message</b></alert> <alert type="info">Information <b>Message</b></alert> <alert type="success">Success <b>Message</b></alert> <alert type="warn">Warning <b>Message</b></alert> <alert type="error">Error <b>Message</b></alert> </div> Show alert message from dynamic HTML string: ```html <Alert v-html="message" /> <script> const message = "Requires <b>Employee</b> Role" </script> ``` <div class="not-prose"> <alert v-html="message"></alert> </div> <api-reference component="AlertSuccess"></api-reference> ## Alert Success Show success alert message: ```html <AlertSuccess>Order was received</AlertSuccess> ``` <div class="not-prose"> <alert-success class="not-prose">Order was received</alert-success> </div> <api-reference component="ErrorSummary"></api-reference> ## Error Summary Show failed Summary API Error Message: ```html <ErrorSummary :status="{ message:'Requires Employee Role' }" /> ``` <div class="not-prose"> <error-summary :status="{ message:'Requires Employee Role' }" class="not-prose"></error-summary> </div> --- # Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/amazon-sqs-mq.md --- slug: amazon-sqs-mq title: Amazon SQS MQ --- ## Enable in an existing Web App Use the `sqs` mixin to register an [MQ Server](/messaging) for Amazon SQS with an existing .NET App: :::sh npx add-in sqs ::: ## Worker Service Template To start 
using Amazon SQS in stand-alone MQ Servers (i.e. without HTTP access), run the MQ Server in an ASP.NET Core Worker Service starting from a pre-configured project template:

<worker-templates template="worker-sqs"></worker-templates>

## Manual Configuration

Support for registering Amazon Simple Queue Service (SQS) as an [MQ Server](/messaging) is available in the [ServiceStack.Aws](https://www.nuget.org/packages/ServiceStack.Aws) NuGet package:

:::copy
`<PackageReference Include="ServiceStack.Aws" Version="10.*" />`
:::

Once installed, SQS can be configured the same way as any other [MQ Server](/messaging): first register the SQS `IMessageService` provider, then register all ServiceStack Services you want to be able to invoke via MQ:

```csharp
container.Register<IMessageService>(c => new SqsMqServer(
    AwsConfig.AwsAccessKey, AwsConfig.AwsSecretKey, RegionEndpoint.USEast1) {
    DisableBuffering = true, // Trade-off latency vs efficiency
});

var mqServer = container.Resolve<IMessageService>();
mqServer.RegisterHandler<MyRequest>(ExecuteMessage);

AfterInitCallbacks.Add(appHost => mqServer.Start());
```

When an MQ Server is registered, ServiceStack automatically publishes Requests accepted on the "One Way" [pre-defined route](/routing#pre-defined-routes) to the registered MQ broker. The message is later picked up and executed by a Message Handler on a background Thread.
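For illustration, the "One Way" pre-defined route takes the form `/{format}/oneway/{RequestDtoName}`, so a sketch of constructing the URL a Request DTO gets published to might look like (the base URL is a hypothetical example, and in C# clients the typed `ToOneWayUrl()` extension method does this for you):

```python
def one_way_url(base_url, request_dto, fmt="json"):
    """Build ServiceStack's "One Way" pre-defined route for a Request DTO name."""
    return f"{base_url.rstrip('/')}/{fmt}/oneway/{request_dto}"
```

For example, `one_way_url("https://example.org", "MyRequest")` yields `https://example.org/json/oneway/MyRequest`, which when POSTed to is published to the registered MQ broker instead of being executed over HTTP.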
## SQS MQ Server Example The [AWS Email Contacts](https://github.com/ServiceStackApps/AwsApps/tree/master/src/AwsApps/emailcontacts) example shows the same long-running [EmailContact Service](https://github.com/ServiceStackApps/AwsApps/blob/4817f5c6ad69defd74d528403bfdb03e5958b0b3/src/AwsApps/emailcontacts/EmailContactServices.cs#L81) being executed from both HTTP and MQ Server by just [changing which url the HTML Form is posted to](https://github.com/ServiceStackApps/AwsApps/blob/4817f5c6ad69defd74d528403bfdb03e5958b0b3/src/AwsApps/emailcontacts/default.cshtml#L203): ```html //html <form id="form-emailcontact" method="POST" action="@(new EmailContact().ToPostUrl())" data-action-alt="@(new EmailContact().ToOneWayUrl())"> ... <div> <input type="checkbox" id="chkAction" data-click="toggleAction" /> <label for="chkAction">Email via MQ</label> </div> ... </form> ``` > The urls are populated from a typed Request DTO using the [Reverse Routing Extension methods](/routing#reverse-routing) Checking the **Email via MQ** checkbox fires the JavaScript handler below that's registered as [declarative event in ss-utils.js](/ss-utils-js#declarative-events): ```js $(document).bindHandlers({ toggleAction: function() { var $form = $(this).closest("form"), action = $form.attr("action"); $form.attr("action", $form.data("action-alt")) .data("action-alt", action); } }); ``` The code to configure and start an SQS MQ Server is similar to [other MQ Servers](/messaging): ```csharp container.Register<IMessageService>(c => new SqsMqServer( AwsConfig.AwsAccessKey, AwsConfig.AwsSecretKey, RegionEndpoint.USEast1) { DisableBuffering = true, // Trade-off latency vs efficiency }); var mqServer = container.Resolve<IMessageService>(); mqServer.RegisterHandler<EmailContacts.EmailContact>(ExecuteMessage); AfterInitCallbacks.Add(appHost => mqServer.Start()); ``` ## Intercepting Filters A number of new filters are available on `SqsMqServer` and `SqsMqClient` which will let you intercept and apply custom 
logic before SQS messages are sent and received:

```csharp
Action<SendMessageRequest,IMessage> SendMessageRequestFilter
Action<ReceiveMessageRequest> ReceiveMessageRequestFilter
Action<Amazon.SQS.Model.Message, IMessage> ReceiveMessageResponseFilter
Action<DeleteMessageRequest> DeleteMessageRequestFilter
Action<ChangeMessageVisibilityRequest> ChangeMessageVisibilityRequestFilter
```

## Polling Duration

The polling duration used to poll SQS queues can be configured with:

```csharp
new SqsMqServer {
    PollingDuration = TimeSpan.FromMilliseconds(1000) //default
}
```

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/grpc/android.md

---
slug: grpc-android
title: gRPC protoc Android Java Client
---

[![](https://raw.githubusercontent.com/NetCoreApps/todo-world/master/TodoWorld/wwwroot/assets/img/android/android-grpc-ssl.png)](https://youtu.be/nag0hr5THug)

<lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="nag0hr5THug" style="background-image: url('https://img.youtube.com/vi/nag0hr5THug/maxresdefault.jpg')"></lite-youtube>

## Android Java-Lite protoc generated GrpcServiceClient Example

This Android gRPC Example differs from the [Java gRPC Example](#java) by using the more appropriate [Java Lite](https://github.com/protocolbuffers/protobuf/blob/master/java/lite.md), which results in a much smaller code size, making it more suitable for embedded Java platforms like Android.

In addition it uses the Android-compatible OK HTTP SSL Libraries in place of Netty's SSL libraries, and a custom `services.proto` allows us to specify the Java **package** we want the generated gRPC client to use.
Install [x dotnet tool](/dotnet-tool): :::sh dotnet tool install --global x ::: Create a new Android App with [Android Studio](https://developer.android.com/studio): ![](https://raw.githubusercontent.com/NetCoreApps/todo-world/master/TodoWorld/wwwroot/assets/img/android/new-android-project.png) Add protoc generated TodoWorld DTOs and gRPC `GrpcServicesStub`: ```bash cd app\src\main\java x proto-java https://todoworld.servicestack.net ``` Modify the downloaded `services.proto` to use the **package** name of your App, e.g: ```protobuf option java_package = "net.servicestack.androidapp"; ``` Generate a **java-lite** protoc gRPC client from your modified `services.proto`: :::sh x proto-java-lite services.proto ::: Update **build.gradle** with required gRPC, protobuf and OK HTTP plugins and dependencies: ```groovy plugins { id 'com.google.protobuf' version '0.8.8' id 'idea' } //... def grpcVersion = '1.27.0' dependencies { implementation 'javax.annotation:javax.annotation-api:1.2' implementation "io.grpc:grpc-protobuf:${grpcVersion}" implementation "io.grpc:grpc-auth:${grpcVersion}" implementation "io.grpc:grpc-census:${grpcVersion}" implementation "io.grpc:grpc-okhttp:${grpcVersion}" implementation "io.grpc:grpc-stub:${grpcVersion}" //... 
}
```

Sync changes to your **build.gradle** to install the new dependencies:

![](https://raw.githubusercontent.com/NetCoreApps/todo-world/master/TodoWorld/wwwroot/assets/img/android/gradle-sync.png)

Add the `android.permission.INTERNET` permission to your **AndroidManifest.xml** (before the `<application/>` tag):

```xml
<uses-permission android:name="android.permission.INTERNET"/>
```

Use protoc generated DTOs and the async `GrpcServicesStub` to perform non-blocking TodoWorld gRPC Service requests:

### Android Java protoc gRPC insecure Example

```java
ManagedChannel channel = ManagedChannelBuilder.forAddress(
    "todoworld.servicestack.net", 50054).usePlaintext().build();

final GrpcServicesGrpc.GrpcServicesStub client = GrpcServicesGrpc.newStub(channel);

fab.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View view) {
        client.getHello(Services.Hello.newBuilder()
            .setName("Android gRPC").build(),
            new StreamObserver<Services.HelloResponse>() {
                @Override
                public void onNext(Services.HelloResponse value) {
                    Snackbar.make(view, value.getResult(), Snackbar.LENGTH_LONG)
                        .setAction("Action", null).show();
                }
                @Override
                public void onError(Throwable t) {}
                @Override
                public void onCompleted() {}
            });
    }
});
```

Now run your App and click the Action button to make a plain-text gRPC Request:

![](https://raw.githubusercontent.com/NetCoreApps/todo-world/master/TodoWorld/wwwroot/assets/img/android/android-grpc-insecure.png)

### Android Java protoc gRPC SSL Example

To use gRPC SSL we'll need a copy of our gRPC Service's SSL Certificate, which we can make available to our Android App by saving it to our App's `assets` directory:

```bash
mkdir ..\assets
x get https://todoworld.servicestack.net/grpc.crt -out ../assets
```

Building a TLS Channel configured with a self-signed SSL Certificate requires a bit of effort with OK HTTP so we'll include a [ChannelBuilder.java](https://gist.github.com/gistlyn/0a3311c1b72b136bdfae616507cc38af) to wrap up the boilerplate:

:::sh
npx add-in grpc-android
:::

This simplifies the configuration
required down to just the `grpc.crt` certificate loaded from the App's Asset Manager, and the host and port of the gRPC SSL Channel:

```java
ManagedChannel channel = null;
InputStream is = null;
try {
    is = getResources().getAssets().open("grpc.crt");
    channel = ChannelBuilder.buildTls(
        "todoworld.servicestack.net", 50051, is);
    is.close();
} catch (Throwable e) {
    e.printStackTrace();
}
```

Let's update the gRPC API call to reflect that we're now using an SSL channel:

```java
client.getHello(Services.Hello.newBuilder()
    .setName("gRPC SSL").build(),
```

Now after re-running our App it'll perform gRPC SSL Service requests instead:

![](https://raw.githubusercontent.com/NetCoreApps/todo-world/master/TodoWorld/wwwroot/assets/img/android/android-grpc-ssl.png)

Refer to [/mobile/java/AndroidGrpc](https://github.com/NetCoreApps/todo-world/tree/master/mobile/java/AndroidGrpc) for a complete example project.

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/templates/angular-bootstrap.md

---
title: Angular Bootstrap Project Template
---

<div class="not-prose my-8 ml-20 flex justify-center"><svg style="max-width:200px" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px" viewBox="0 0 250 250" xml:space="preserve">
<g>
<polygon fill="#DD0031" points="125,30 125,30 125,30 31.9,63.2 46.1,186.3 125,230 125,230 125,230 203.9,186.3 218.1,63.2 "/>
<polygon fill="#C3002F" points="125,30 125,52.2 125,52.1 125,153.4 125,153.4 125,230 125,230 203.9,186.3 218.1,63.2 125,30 "/>
<path fill="#FFFFFF" d="M125,52.1L66.8,182.6h0h21.7h0l11.7-29.2h49.4l11.7,29.2h0h21.7h0L125,52.1L125,52.1L125,52.1L125,52.1 L125,52.1z M142,135.4H108l17-40.9L142,135.4z"/>
</g>
</svg></div>

[Angular](https://angular.io) is the premier JavaScript framework developed by Google for building applications that live on the web, mobile, or the desktop.
## Angular .NET Core and .NET Framework Single Page App Templates

The templates below have been bootstrapped with the latest angular-cli tooling that's [seamlessly integrated](/templates/single-page-apps#end-to-end-typed-apis) into ServiceStack's [Recommended Physical Project Structure](/physical-project-structure).

See the documentation in each project for more info on features of each template:

### [Angular 15 SPA Template](https://github.com/LegacyTemplates/angular-spa)

.NET 8.0 Angular 15 project generated with the [Angular CLI](https://github.com/angular/angular-cli).

![](https://raw.githubusercontent.com/ServiceStack/Assets/master/csharp-templates/angular-spa-bootstrap.png)

> Browse [source code](https://github.com/LegacyTemplates/angular-spa) and install with [x new](/web-new):

Create a new Angular Project for .NET 8.0:

:::sh
x new LegacyTemplates/angular-spa ProjectName
:::

Create a new Angular Project for .NET Framework:

:::sh
x new angular-spa-netfx ProjectName
:::

#### Angular HTTP Client

The Angular template uses Angular's built-in Rx-enabled HTTP Client with ServiceStack's ambient TypeScript declarations, as it's often preferable to utilize Angular's built-in dependencies when available. ServiceStack's ambient TypeScript interfaces are leveraged to enable a Typed API, whilst the `createUrl(route,args)` helper lets you reuse your API's Route definitions (emitted in comments above each Request DTO) to provide a pleasant UX for making API calls using Angular's HTTP Client:

```ts
import { createUrl } from '@servicestack/client';
...
this.http.get<HelloResponse>(createUrl('/hello/{Name}', { name })).subscribe(r => {
    this.result = r.result;
});
```

# Angular Examples

## [TechStacks](https://github.com/ServiceStackApps/TechStacks)

TechStacks is an AngularJS App that lets you explore TechStacks of popular Startups using your favorite technology

[![TechStacks](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/livedemos/techstacks/screenshots/techstacks.png)](https://github.com/ServiceStackApps/TechStacks)

#### Features

TechStacks is based on a [Bootstrap template](http://getbootstrap.com) with client-side features:

- HTML5 Routing to enable pretty urls, with support for full page reloads and the back button
- Same Services supporting both human-readable Slugs or int primary keys
- Responsive design supporting iPad Landscape and Portrait modes
- Preloading and background data fetching to reduce flicker and maximize responsiveness
- [Disqus](https://disqus.com/) commenting system
- [Chosen](http://harvesthq.github.io/chosen/) for UX-friendly multi combo boxes

Some of TechStacks' back-end features include:

- [SEO-optimized, Server HTML generated, read-only version of the website](https://techstacks.io/?html=server)
- Dynamically generated [sitemaps.xml](https://techstacks.io/sitemap.xml)
- Page-level Locking
- Record and Restore Page Content Versioning
- [Twitter and GitHub OAuth Providers](/auth/authentication-and-authorization)
- Substitutable [OrmLite](https://github.com/ServiceStack/ServiceStack.OrmLite) RDBMS [PostgreSQL and Sqlite](https://github.com/ServiceStackApps/TechStacks/blob/875e78910e43d2230f0925b71d5990497216511e/src/TechStacks/TechStacks/AppHost.cs#L49-L56) back-ends
- [Auto Query](/autoquery/) for automatic services of RDBMS tables
- [RDBMS Sessions and In Memory Caching](/caching)
- [Fluent Validation](/validation)

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/templates/angular.md

---
title: .NET 10
Angular 21 Tailwind Template
---

We're excited to announce the release of our new **Angular 21 SPA Template** - a modern, full-stack template combining the latest Angular 21 frontend with a powerful .NET 10 backend powered by ServiceStack.

<vibe-template template="angular-spa" title="Angular SPA" description="Minimal foundation for a Single Page Application (SPA) statically generated by Angular. Perfect for dashboards, internal tools, admin panels, and highly interactive apps where SEO isn't a priority." href="https://angular-spa.web-templates.io" screenshot="/img/pages/templates/angular-spa.webp" github-template="https://github.com/new?template_name=angular-spa&template_owner=NetCoreTemplates"></vibe-template>

## What's New

### Angular with Modern Features

- **Standalone Components** - No NgModules, cleaner component architecture
- **Signal-based State Management** - Reactive state with Angular's new signals API
- **TypeScript 5.9** - Latest TypeScript features and improved type safety
- **Tailwind CSS 4** - Utility-first styling with dark mode support

### .NET 10 Backend

- **ServiceStack v10** - High-performance .NET APIs with AutoQuery CRUD
- **Entity Framework Core 10** - For ASP.NET Core Identity
- **OrmLite** - Fast, typed POCO ORM for application data
- **SQLite** - Zero-configuration database (easily swap for PostgreSQL, SQL Server, etc.)

### Upgrading to a production RDBMS

To switch from SQLite to PostgreSQL/SQL Server/MySQL:

1. Install your preferred RDBMS package (`ef-postgres`, `ef-mysql`, `ef-sqlserver`), e.g:

:::sh {.mb-8}
npx add-in ef-postgres
:::

2.
Install `db-identity` to also switch to use this RDBMS for [Background Jobs](/rdbms-background-jobs) and [Request Logs Analytics](/admin-ui-rdbms-analytics): :::sh {.mb-8} npx add-in db-identity ::: ## Simplified .NET + Angular Development Workflow - Single endpoint `https://localhost:5001` for both .NET and Angular UI (no dev certs required) - ASP.NET Core proxies requests to Angular dev server (port 4200) - Hot Module Replacement (HMR) support for instant UI updates - WebSocket proxying for Angular HMR functionality ![](/img/pages/templates/angular-dev.svg) ## .NET Angular App with Static Export **Angular SPA** uses **static export**, where a production build of the Angular App is generated at deployment and published together with the .NET App in its `/wwwroot` folder, utilizing static file serving to render its UI. This minimal `angular-spa` starting template is perfect for your next AI Assisted project, offering a streamlined foundation for building modern web applications with **Angular 21** and **.NET 10**: ![](/img/pages/templates/static-prod.svg) ## Key Features ### 🔐 ASP.NET Core Identity Authentication Full authentication system with beautifully styled Tailwind CSS pages: - User registration and login - Email confirmation - Password reset - Profile management - Role-based authorization ### ⚡ Rapid AutoQuery CRUD dev workflow Quickly generate complete C# [CRUD APIs](/autoquery/crud) and [DB Migrations](/ormlite/db-migrations) from simple [TypeScript data models](https://localhost:5002/autoquery/okai-models): 1. Create a new feature :::sh npx okai init MyFeature ::: 2. Define your TypeScript data models in `MyFeature.d.ts`, e.g: :::sh code MyApp.ServiceModel/MyFeature.d.ts ::: 3. When ready, generate C# APIs and migrations :::sh npx okai MyFeature.d.ts ::: 4. Apply database migrations :::sh npm run migrate ::: ### Use AI for quick scaffolding To help quickly scaffold your data models and features, use ServiceStack's AI assistant. 
Example of creating AutoQuery CRUD APIs for managing products: :::sh npx okai "Manage products price and inventory" ::: ### 📊 Background Jobs Durable background job processing with: - Command-based job execution - Recurring job scheduling - SMTP email sending via background workers ### 📝 Request Logging SQLite-backed request logging for: - API request tracking - Error monitoring - Performance analysis ### 🔍 Built-in Admin UIs - **/ui** - ServiceStack API Explorer - **/admin-ui** - Database management, user administration - **/swagger** - OpenAPI documentation (development mode) ## Architecture Highlights ### Hybrid Development Model During development, `dotnet watch` starts both the .NET backend and Angular dev server with Hot Module Replacement. In production, Angular builds to static files served directly by ASP.NET Core. ### Modular Configuration Clean separation of concerns with `IHostingStartup` pattern: - [Configure.AppHost.cs](https://github.com/NetCoreTemplates/angular-spa/blob/main/MyApp/Configure.AppHost.cs) - Main ServiceStack AppHost registration - [Configure.Auth.cs](https://github.com/NetCoreTemplates/angular-spa/blob/main/MyApp/Configure.Auth.cs) - ServiceStack AuthFeature with ASP.NET Core Identity integration - [Configure.AutoQuery.cs](https://github.com/NetCoreTemplates/angular-spa/blob/main/MyApp/Configure.AutoQuery.cs) - AutoQuery features and audit events - [Configure.Db.cs](https://github.com/NetCoreTemplates/angular-spa/blob/main/MyApp/Configure.Db.cs) - Database setup (OrmLite for app data, EF Core for Identity) - [Configure.Db.Migrations.cs](https://github.com/NetCoreTemplates/angular-spa/blob/main/MyApp/Configure.Db.Migrations.cs) - Runs OrmLite and EF DB Migrations and creates initial users - [Configure.BackgroundJobs.cs](https://github.com/NetCoreTemplates/angular-spa/blob/main/MyApp/Configure.BackgroundJobs.cs) - Background job processing - 
[Configure.HealthChecks.cs](https://github.com/NetCoreTemplates/angular-spa/blob/main/MyApp/Configure.HealthChecks.cs) - Health monitoring endpoint

This pattern keeps [Program.cs](https://github.com/NetCoreTemplates/angular-spa/blob/main/MyApp/Program.cs) clean and separates concerns.

### Type-Safe API Client

Auto-generated TypeScript DTOs ensure type safety across the stack:

```typescript
import { QueryBookings } from '@/dtos'

const response = await client.api(new QueryBookings({ minCost: 100 }))
if (response.succeeded) {
    console.log(response.response!.results)
}
```

## Deployment Ready

GitHub Actions workflows included for:

- **CI/CD** - Automated build and test
- **Container Builds** - Docker image creation
- **Kamal Deployment** - One-command production deployment with SSL

### Kamal Deployments

All deployments include the GitHub Action workflows to deploy your App to [any Linux Server with Kamal](/kamal-deploy) using Docker, SSH and GitHub Container Registry (ghcr). You can host it on a [Hetzner US Cloud](https://www.hetzner.com/cloud) VM for as low as **$5 per month**, or if you have multiple Apps you can deploy them all to a single VM, which is what we're doing for our .NET Template Live Demos which runs **30 Docker Apps** on an **8GB RAM/80GB SSD** dedicated VM for **$15 /month**.

## AI-Assisted Development with CLAUDE.md

As part of our objective of improving developer experience and embracing modern AI-assisted development workflows, all new .NET templates include a comprehensive `AGENTS.md` file designed to optimize AI-assisted development workflows.

### What is CLAUDE.md?

`CLAUDE.md` and [AGENTS.md](https://agents.md) onboard Claude (and other AI assistants) to your codebase by using a structured documentation file that provides complete context about your project's architecture, conventions, and technology choices. This enables more accurate code generation, better suggestions, and faster problem-solving.
### What's Included

Each template's `AGENTS.md` contains:

- **Project Architecture Overview** - Technology stack, design patterns, and key architectural decisions
- **Project Structure** - Gives Claude a map of the codebase
- **ServiceStack Conventions** - DTO patterns, Service implementation, AutoQuery, Authentication, and Validation
- **Angular Integration** - TypeScript DTO generation, API client usage, component patterns, and form handling
- **Database Patterns** - OrmLite setup, migrations, and data access patterns
- **Common Development Tasks** - Step-by-step guides for adding APIs, implementing features, and extending functionality
- **Testing & Deployment** - Test patterns and deployment workflows

### Extending with Project-Specific Details

The existing `CLAUDE.md` serves as a solid foundation, but for best results you should extend it with project-specific details like the purpose of the project, its key parts and features, and any unique conventions you've adopted.

### Benefits

- **Faster Onboarding** - New developers (and AI assistants) understand project conventions immediately
- **Consistent Code Generation** - AI tools generate code following your project's patterns
- **Better Context** - AI assistants can reference specific ServiceStack patterns and conventions
- **Reduced Errors** - Clear documentation of framework-specific conventions
- **Living Documentation** - Keep it updated as your project evolves

### How to Use

Claude Code and most AI Assistants already support automatically referencing `CLAUDE.md` and `AGENTS.md` files; for others you can just include it in your prompt context when asking for help, e.g:

> Using my project's AGENTS.md, can you help me add a new AutoQuery API for managing Products?

The AI will understand your App's ServiceStack conventions, Angular setup, and project structure, providing more accurate and contextual assistance.
### Getting Started

All new [angular-spa.web-templates.io](https://angular-spa.web-templates.io) projects include [AGENTS.md](https://github.com/NetCoreTemplates/angular-spa/blob/main/AGENTS.md) by default. For existing projects, you can adapt the template to document your App's conventions, patterns and technology choices.

## Feature Tour

Angular's structured approach to modern web development is ideal for large, complex Applications that stitch together various technologies, handle authentication, design responsive UIs, and manage complex state. The new Angular SPA template embraces this approach to provide a productive starting point with a robust foundation packed with essential features right out of the box.

1. **Built-in Identity Authentication:** Secured out-of-the-box, this template integrates seamlessly with ASP.NET Core Identity, providing ready-to-use registration, login, and User Admin management features.
2. **Tailwind v4 CSS:** Rewritten to use Tailwind v4 CSS, allowing you to rapidly build beautiful, responsive designs directly in your markup.
3. **Dark Mode Support:** Cater to user preferences with built-in, easily toggleable dark mode support, styled elegantly with Tailwind.
4. **Customizable DataGrid Component:** Effortlessly display tabular data with the included customizable DataGrid. Easily adapt it for sorting, filtering and displaying your specific data structures.
5. **Reusable Input Components with Validation:** The template includes reusable, pre-styled input components (e.g., text input, selects) with built-in support for validation bound forms and contextual display of validation errors.
6. **RxJS & Signals Support:** Modern Angular reactivity: whether you prefer the established power of **RxJS Observables** or the new granular reactivity of **Angular Signals**, our template is structured to support *both* programming models.
We'll take a quick tour to explore the template's features:

### Home Page

The home page sports a responsive Tailwind design where all its components are encapsulated within its [/app/home](https://github.com/NetCoreTemplates/angular-spa/tree/main/MyApp.Client/src/app/home) folder, with its logic maintained in `*.ts` files and its presentation UI optionally maintained in a separate `*.html` file.

<screenshots-gallery class="not-prose mb-8" grid-class="grid grid-cols-1 md:grid-cols-2 gap-4" :images="{ 'Home': '/img/pages/templates/angular-spa.webp', 'Home - Dark Mode': '/img/pages/templates/angular-spa-dark.webp', }"></screenshots-gallery>

### Dark Mode

The [dark-mode-toggle.component.ts](https://github.com/NetCoreTemplates/angular-spa/blob/main/MyApp.Client/src/components/dark-mode-toggle.component.ts) and [theme.service.ts](https://github.com/NetCoreTemplates/angular-spa/blob/main/MyApp.Client/src/components/services/theme.service.ts) handle switching between Light and Dark Mode, which is initially populated from the user's OS preference.
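The theme resolution can be sketched as a small pure function. `resolveTheme` below is a hypothetical helper (not the template's actual `theme.service.ts`) that prefers an explicitly saved choice over the OS `prefers-color-scheme` setting:

```typescript
// Hypothetical sketch of Light/Dark Mode resolution, not the template's code.
type Theme = 'light' | 'dark'

// An explicit user choice (e.g. persisted to localStorage by the toggle)
// always wins; otherwise fall back to the OS preference, which in the browser
// comes from window.matchMedia('(prefers-color-scheme: dark)').matches
function resolveTheme(saved: Theme | null, osPrefersDark: boolean): Theme {
    return saved ?? (osPrefersDark ? 'dark' : 'light')
}

// Tailwind's dark mode is typically applied by toggling a 'dark' class on <html>:
// document.documentElement.classList.toggle('dark', theme === 'dark')

console.log(resolveTheme(null, true))    // → 'dark'  (no saved choice: follow the OS)
console.log(resolveTheme('light', true)) // → 'light' (saved choice overrides the OS)
```
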
### Weather

The Weather page maintained in [/app/weather](https://github.com/NetCoreTemplates/angular-spa/tree/main/MyApp.Client/src/app/weather) provides a good example of utilizing an RxJS Observable programming model with the [api-http-client.service.ts](https://github.com/NetCoreTemplates/angular-spa/blob/main/MyApp.Client/src/components/services/api-http-client.service.ts) that extends Angular's Observable HttpClient with an additional `api` method that lets you use your Service's typed `dtos.ts` TypeScript DTOs to enable type-safe integration with your back-end services:

```ts
import { Forecast, GetWeatherForecast, ResponseStatus } from 'src/dtos'
import { ApiHttpClient } from 'src/components/services/api-http-client.service'

export class WeatherComponent {
    http = inject(ApiHttpClient);

    public error: ResponseStatus | null = null;
    public forecasts: Forecast[] = [];

    getForecasts() {
        this.http.api(new GetWeatherForecast({ date:'2025-04-01' })).subscribe({
            next:(result) => {
                this.error = null;
                this.forecasts = result;
            },
            error:(error) => {
                this.error = error;
            }
        });
    }
}
```

Its [weather.component.html](https://github.com/NetCoreTemplates/angular-spa/blob/main/MyApp.Client/src/app/weather/weather.component.html) template showcases the new [data-grid.component.ts](https://github.com/NetCoreTemplates/angular-spa/blob/main/MyApp.Client/src/components/data-grid.component.ts) to display a beautiful tailwind DataGrid with just:

```html
<data-grid [items]="forecasts"></data-grid>
```

:::{.not-prose .p-4 .mx-auto .max-w-3xl .shadow .rounded-lg}
[![](/img/pages/templates/angular-spa/angular-datagrid-default.webp)](https://angular-spa.web-templates.io/weather)
:::

It's a direct port of our [Vue DataGrid](https://docs.servicestack.net/vue/datagrid) that also supports the same customizations, allowing for custom Headers and Column fields, e.g:

```html
<data-grid [items]="forecasts">
    <ng-template #dateHeader>
        <div class="flex items-center">
            <span class="font-bold
text-green-700 uppercase">Date</span>
        </div>
    </ng-template>
    <ng-template #date let-x="date">{{ x | date:'MMMM d, yyyy' }}</ng-template>
    <ng-template #temperatureC let-x="temperatureC"> {{ x }}° </ng-template>
    <ng-template #temperatureF let-x="temperatureF"> {{ x }}° </ng-template>
    <ng-template #summary let-x="summary">{{ x }}</ng-template>
</data-grid>
```

Which renders the expected:

:::{.not-prose .p-4 .mx-auto .max-w-3xl .shadow .rounded-lg}
[![](/img/pages/templates/angular-spa/angular-datagrid-custom.webp)](https://angular-spa.web-templates.io/weather)
:::

## Todos MVC

The Todos MVC App maintained in [/app/todomvc](https://github.com/NetCoreTemplates/angular-spa/tree/main/MyApp.Client/src/app/todomvc) demonstrates how to create the popular [todomvc.com](https://todomvc.com) App in Angular.

:::{.not-prose .p-4 .mx-auto .max-w-3xl .shadow .rounded-lg}
[![](/img/pages/templates/angular-spa/angular-todos.webp)](https://angular-spa.web-templates.io/todomvc)
:::

It's another example of building a simple CRUD Application with Angular RxJS Observables and your API's TypeScript DTOs. This snippet shows how to query and create Todos with the `ApiHttpClient`:

```ts
import { Todo, QueryTodos, CreateTodo, ResponseStatus } from 'src/dtos'
import { ApiHttpClient } from 'src/components/services/api-http-client.service'

export class TodoMvcComponent implements OnInit {
    client = inject(ApiHttpClient);
    error: ResponseStatus | null = null;
    todos: Todo[] = [];
    newTodoText = '';

    loadTodos(): void {
        this.client.api(new QueryTodos()).subscribe({
            next: (todos) => {
                this.todos = todos.results;
            },
            error: (err) => {
                this.error = err;
            }
        });
    }

    addTodo(): void {
        if (!this.newTodoText.trim()) return;
        this.client.api(new CreateTodo({ text: this.newTodoText.trim() })).subscribe({
            next: (todo) => {
                this.todos.push(todo);
                this.newTodoText = '';
            },
            error: (err) => {
                this.error = err;
                console.error('Error adding todo:', err);
            }
        });
    }
    //...
}
```

## Bookings

All other examples in the template use Angular's newer Signals for reactivity and the standard ServiceStack `JsonServiceClient` used in all other TypeScript/JS Apps.

The Bookings Pages are maintained in [/app/bookings](https://github.com/NetCoreTemplates/angular-spa/tree/main/MyApp.Client/src/app/bookings) and showcase a more complete example of developing a CRUD UI in Angular, starting with an example of how to encapsulate route information for a feature in an isolated [booking.routes.ts](https://github.com/NetCoreTemplates/angular-spa/blob/main/MyApp.Client/src/app/bookings/booking.routes.ts):

```ts
import { Routes } from '@angular/router';
import { BookingListComponent } from './booking-list.component';
import { BookingCreateComponent } from './booking-create.component';
import { BookingEditComponent } from './booking-edit.component';
import { authGuard } from 'src/guards';

export const BOOKING_ROUTES: Routes = [
    { path: 'bookings', component: BookingListComponent, canActivate: [authGuard] },
    { path: 'bookings/create', component: BookingCreateComponent, canActivate: [authGuard] },
    { path: 'bookings/edit/:id', component: BookingEditComponent, canActivate: [authGuard] }
];
```

The use of the Route `authGuard` ensures only Authenticated Users can access these routes, as well as redirecting non-authenticated users to the Sign In page.

### Bookings List

:::{.not-prose .p-4 .mx-auto .max-w-3xl .shadow .rounded-lg}
[![](/img/pages/templates/angular-spa/angular-bookings-list.webp)](https://angular-spa.web-templates.io/bookings)
:::

The bookings list component shows an example of using Angular's Signals with the `JsonServiceClient` together with an `ApiState` context to enable data bound forms and validation errors:

```ts
@Component({
    templateUrl: './booking-list.component.html',
    providers: [ ...provideApiState() ],
    //...
}) export class BookingListComponent implements OnInit { private router = inject(Router); private client = inject(JsonServiceClient); api = inject(ApiState); // Signals for state allBookings = signal<Booking[]>([]); ngOnInit(): void { this.loadBookings(); } async loadBookings(): Promise<void> { this.api.begin(); const api = await this.client.api(new QueryBookings({ orderByDesc: 'BookingStartDate', })); if (api.succeeded) { this.allBookings.set(api.response!.results); } this.api.complete(api.error); } } ``` Using `provideApiState()` implicitly injects the populated API context containing both the APIs Loading and Error state into child components saving you from having to explicitly inject it into each component. E.g. the `<form-loading>` component will display when API Requests are in-flight whilst API Error Responses are displayed after receiving failed API Responses: ```html <app-page title="Bookings" class="max-w-6xl"> <form-loading text="Loading Bookings..."></form-loading> <error-summary></error-summary> @if (allBookings().length > 0) { <data-grid [items]="allBookings()">...</data-grid> } @else { <div class="text-center py-4 bg-gray-50 rounded-md"> <p class="text-gray-500">No bookings found</p> </div> } </app-page> ``` ### Create Booking The [booking-create.component.ts](https://github.com/NetCoreTemplates/angular-spa/blob/main/MyApp.Client/src/app/bookings/booking-create.component.ts) shows the standard pattern of calling ServiceStack Typed APIs to save forms whilst saving any validation errors to the `ApiState` context: ```ts async save(): Promise<void> { this.api.begin(); const request = new CreateBooking(this.booking()); const api = await this.client.api(request); if (api.succeeded) { // Navigate back to bookings list after successful save this.router.navigate(['/bookings']); } this.api.complete(api.error); } ``` Where any contextual validation will be displayed next to the input field: :::{.not-prose .p-4 .mx-auto .max-w-3xl .shadow .rounded-lg} 
[![](/img/pages/templates/angular-spa/angular-booking-create-validation.webp)](https://angular-spa.web-templates.io/bookings/create)
:::

### Edit Booking

The [booking-edit.component.ts](https://github.com/NetCoreTemplates/angular-spa/blob/main/MyApp.Client/src/app/bookings/booking-edit.component.ts) shows an example of using the `JsonServiceClient` with Signals to get and modify bookings:

```ts
export class BookingEditComponent implements OnInit {
    private route = inject(ActivatedRoute);
    private router = inject(Router);
    private client = inject(JsonServiceClient);
    meta = inject(MetadataService);
    api = inject(ApiState);

    // Signals
    booking = signal<Booking>(new Booking());

    ngOnInit(): void {
        // Get booking ID from route params
        const id = this.route.snapshot.paramMap.get('id');
        if (id) {
            this.fetchBooking(parseInt(id, 10));
        } else {
            this.api.setErrorMessage('Booking ID is required');
        }
    }

    async fetchBooking(id: number): Promise<void> {
        this.api.begin();
        const api = await this.client.api(new QueryBookings({id}));
        if (api.succeeded) {
            this.booking.set(api.response!.results[0]);
        }
        this.api.complete(api.error);
    }

    async save(): Promise<void> {
        this.api.begin();
        const api = await this.client.api(new UpdateBooking(this.booking()));
        if (api.succeeded) {
            this.router.navigate(['/bookings']);
        }
        this.api.complete(api.error);
    }
}
```

:::{.not-prose .p-4 .mx-auto .max-w-3xl .shadow .rounded-lg}
[![](/img/pages/templates/angular-spa/angular-booking-edit.webp)](https://angular-spa.web-templates.io/bookings/edit/1)
:::

It shows an example of a validation-bound form bound to a signal instance of a `Booking` DTO, with summary and contextual validation, and utilization of your API's metadata with `meta.enumOptions('RoomType')` which populates the `<select>` drop down with the C# `RoomType` enum values:

```html
<app-page title="Edit Booking">
    <form-loading></form-loading>
    @if (booking().id) {
    <form class="grid gap-4 py-4" (ngSubmit)="save()">
        <input class="hidden" type="submit" />
        <fieldset
[disabled]="api.loading()"> <error-summary except="name,roomType,roomNumber,cost,bookingStartDate,notes" class="mb-4"></error-summary> <div class="grid grid-cols-6 gap-6"> <div class="col-span-6 sm:col-span-3"> <text-input id="name" name="name" required placeholder="Name for this booking" [(ngModel)]="booking().name"></text-input> </div> <div class="col-span-6 sm:col-span-3"> <select-input id="roomType" name="roomType" [options]="meta.enumOptions('RoomType')" [(ngModel)]="booking().roomType"></select-input> </div> <div class="col-span-6 sm:col-span-3"> <text-input type="number" id="roomNumber" name="roomNumber" min="0" required [(ngModel)]="booking().roomNumber"></text-input> </div> <div class="col-span-6 sm:col-span-3"> <text-input type="number" id="cost" name="cost" min="0" required [(ngModel)]="booking().cost"></text-input> </div> <div class="col-span-6 sm:col-span-3"> <text-input type="date" id="bookingStartDate" name="bookingStartDate" required [(ngModel)]="booking().bookingStartDate"></text-input> </div> <div class="col-span-6 sm:col-span-3"> <text-input type="date" id="bookingEndDate" name="bookingEndDate" [(ngModel)]="booking().bookingEndDate"></text-input> </div> <div class="col-span-6"> <textarea-input id="notes" name="notes" rows="6" placeholder="Notes about this booking" [(ngModel)]="booking().notes"></textarea-input> </div> </div> </fieldset> <div class="flex justify-between"> <div> <confirm-delete (delete)="delete()"></confirm-delete> </div> <div class="flex flex-wrap sm:flex-nowrap gap-4"> <secondary-button (click)="close()"> Close </secondary-button> @if (booking().cancelled) { <primary-button type="button" color="green" [disabled]="api.loading()" (click)="cancelBooking(false)"> Restore Booking </primary-button> } @else { <primary-button type="button" color="red" [disabled]="api.loading()" (click)="cancelBooking(true)"> Cancel Booking </primary-button> } <primary-button type="submit" [disabled]="api.loading()"> {{ api.loading() ? 'Saving...' 
: 'Save Booking' }}
                </primary-button>
            </div>
        </div>
    </form>
    } @else {
    <error-summary></error-summary>
    }
</app-page>
```

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/auth/anti-forgery.md

---
title: Anti Forgery
slug: anti-forgery
---

You can leverage ASP.NET MVC's AntiForgery token support in your Razor pages by embedding the token in your HTML Forms with:

## Example

```html
<form action="/antiforgery/test" method="POST">
    @Html.AntiForgeryToken()
    <input name="Field" value="Test"/>
    <input type="submit"/>
</form>
```

Which you can then validate in your Service with:

```csharp
[Route("/antiforgery/test")]
public class AntiForgeryTest
{
    public string Field { get; set; }
}

public class AntiForgeryService : Service
{
    public object Any(AntiForgeryTest request)
    {
        AntiForgery.Validate();
        ...
    }
}
```

::: info
ASP.NET MVC's AntiForgery API is only available in the .NET Framework
:::

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/api-design.md

---
slug: api-design
title: ServiceStack API design
---

The primary difference between developing RPC vs ServiceStack's [Message-based Services](/what-is-a-message-based-web-service) is that the Service's entire contract is defined by its typed messages, specifically the Request DTO which defines both the System inputs and identifies the System output. Typically both are POCO DTOs however the [response can be any serializable object](/service-return-types).

The simplest Service example that does this is:

```csharp
public class MyRequest : IReturn<MyRequest> {}

public class MyServices : Service
{
    public object Any(MyRequest request) => request;
}
```

As only the `Any()` wildcard method is defined, it will get executed whenever the `MyRequest` Service is invoked via **any HTTP Verb**, [gRPC](/grpc/), [MQ](/messaging) or [SOAP](/soap-support) Request.
The Request DTO is also all that's required to invoke it via any [Typed Generic Service Client](/clients-overview) in any supported language, e.g:

```csharp
MyRequest response = client.Get(new MyRequest());
```

All Services are accessible by their [pre-defined routes](/routing#pre-defined-routes). We can turn this into a functional data-driven Service by annotating it with a [user-defined route](/routing) and changing the implementation to return all App Contacts:

```csharp
public class Contact
{
    public int Id { get; set; }
    public string Name { get; set; }
}

[Route("/contacts")]
public class GetContacts : IReturn<List<Contact>> { }

public class ContactsService : Service
{
    public object Get(GetContacts request) => Db.Select<Contact>();
}
```

Which your C# clients will still be able to call with:

```csharp
List<Contact> response = client.Get(new GetContacts());
```

This will make a **GET** call to the custom `/contacts` URL and return all rows from the `Contact` Table in the configured RDBMS using [OrmLite](/ormlite/)'s `Select()` extension method on the `base.Db` ADO.NET `IDbConnection` property of ServiceStack's convenience `Service` base class.

Using `Get()` limits access to this service from HTTP **GET** requests only; all other HTTP Verb requests to `/contacts` will return a **404 NotFound** HTTP Error Response.
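The same typed call is available from TypeScript clients. As a hedged sketch, the generated `dtos.ts` for this service would roughly contain the following (in a real project this file is produced by ServiceStack's Add ServiceStack Reference tooling, not written by hand):

```typescript
// Sketch of what a generated dtos.ts would roughly emit for the C# DTOs above.
class Contact {
    public id: number = 0
    public name: string = ''
    public constructor(init?: Partial<Contact>) { Object.assign(this, init) }
}

class GetContacts {
    public constructor(init?: Partial<GetContacts>) { Object.assign(this, init) }
    public getTypeName() { return 'GetContacts' }           // identifies the API
    public getMethod() { return 'GET' }                     // preferred HTTP method
    public createResponse() { return new Array<Contact>() } // typed response
}

// With these DTOs, JsonServiceClient from @servicestack/client makes the same
// typed call as the C# client (network call shown commented out):
//
//   const client = new JsonServiceClient('https://example.org')
//   const contacts: Contact[] = await client.get(new GetContacts())
```

The `getTypeName()`, `getMethod()` and `createResponse()` functions are what let the client locate the API, pick the HTTP verb and type the response without any further configuration.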
### Using explicit Response DTO

Our recommendation instead of returning naked collections is returning an explicit predictable Response DTO, e.g:

```csharp
[Route("/contacts")]
public class GetContacts : IReturn<GetContactsResponse> { }

public class GetContactsResponse
{
    public List<Contact> Results { get; set; }
    public ResponseStatus ResponseStatus { get; set; }
}

public class ContactsService : Service
{
    public object Get(GetContacts request) =>
        new GetContactsResponse {
            Results = Db.Select<Contact>()
        };
}
```

Whilst slightly more verbose, this style benefits from [more resilience in evolving and versioning](https://stackoverflow.com/a/12413091/85785) message-based Services and more coarse-grained APIs, as additional results can be added to the Response DTO without breaking existing clients.

You'll also need to follow the above convention if you want to [support SOAP endpoints](/soap-support) or if you want to be able to handle Typed [Response Messages in MQ Services](/messaging#message-workflow).

### All APIs have a preferred default method

Like the `Send*` APIs before them, both [API Explorer](/api-explorer) and the new [`Api*` methods](/csharp-client.html#high-level-api-and-apiasync-methods) send API requests using an API's **preferred HTTP Method** which can be defined either:

- Explicitly annotating Request DTOs with `IGet`, `IPost`, etc. **IVerb** interface markers
- Using the verb specified in its user-defined `[Route]` attribute (if a single verb is specified)
- Implicitly when using AutoQuery/CRUD Request DTOs
- Using the Service's **Verb()** implementation method if not using **Any()**

If the HTTP Method can't be inferred, it defaults to using HTTP **POST**. But as good API documentation practice, we recommend specifying the HTTP Method each API should use, preferably using the `IVerb` interface marker, so it's embedded into the API's Service Contract shared with clients (not required for AutoQuery APIs).
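From a client's perspective this resolution can be sketched as a small helper. `resolveMethod` below is a hypothetical illustration of the documented fallback (generated DTOs in `@servicestack/client` expose their preferred verb via a `getMethod()` function), not the library's actual internals:

```typescript
// Hypothetical sketch of preferred HTTP method resolution for Request DTOs.
interface RequestDto {
    getMethod?: () => string  // present on generated DTOs when a verb is known
}

function resolveMethod(dto: RequestDto): string {
    // Use the verb embedded in the API's contract (IGet/IPost markers or a
    // single-verb [Route]); otherwise default to POST
    return dto.getMethod?.() ?? 'POST'
}

class GetContacts implements RequestDto {
    getMethod() { return 'GET' }  // verb embedded in the Service Contract
}
class CreateContact implements RequestDto { }  // no verb declared

console.log(resolveMethod(new GetContacts()))   // → 'GET'
console.log(resolveMethod(new CreateContact())) // → 'POST'
```
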
## ServiceStack's API Design

We'll walk through a few examples here, but for a more detailed look into the usages and capabilities of ServiceStack's API design check out its [Comprehensive Test Suite](https://github.com/ServiceStack/ServiceStack/blob/master/tests/RazorRockstars.Console.Files/ReqStarsService.cs)

At a minimum ServiceStack Services only need to implement the `IService` empty interface:

```csharp
public interface IService {}
```

The interface is used as a Marker interface that ServiceStack uses to find, register and auto-wire your existing services. Although you're more likely going to want to inherit from ServiceStack's convenience concrete `Service` class which contains easy access to ServiceStack's providers:

```csharp
public class Service : IService
{
    IRequest Request { get; }                          // HTTP Request Context
    IResponse Response { get; }                        // HTTP Response Context
    IServiceGateway Gateway { get; }                   // Built-in Service Gateway
    IMessageProducer MessageProducer { get; }          // Message Producer for Registered MQ Server
    void PublishMessage<T>(T message);                 // Publish messages to Registered MQ Server
    IVirtualPathProvider VirtualFileSources { get; }   // Virtual FileSystem Sources
    IVirtualFiles VirtualFiles { get; }                // Writable Virtual FileSystem
    ICacheClient Cache { get; }                        // Registered Caching Provider
    ICacheClientAsync CacheAsync { get; }              // Registered Async Caching Provider (or sync wrapper)
    MemoryCacheClient LocalCache { get; }              // Local InMemory Caching Provider
    IDbConnection Db { get; }                          // Registered ADO.NET IDbConnection
    IRedisClient Redis { get; }                        // Registered RedisClient
    ValueTask<IRedisClientAsync> GetRedisAsync();      // Registered Async RedisClient
    IAuthRepository AuthRepository { get; }            // Registered User Repository
    IAuthRepositoryAsync AuthRepositoryAsync { get; }  // Registered Async User Repository
    ISession SessionBag { get; }                       // Dynamic Session Bag
    ISessionAsync SessionBagAsync { get; }             // Dynamic Async Session Bag
    Task<TUserSession> SessionAsAsync<TUserSession>(); // Resolve Typed UserSession Async
    TUserSession SessionAs<TUserSession>();            // Resolve Typed UserSession
    IAuthSession GetSession();                         // Resolve base IAuthSession
    Task<IAuthSession> GetSessionAsync();              // Resolve base IAuthSession Async
    bool IsAuthenticated { get; }                      // Is Authenticated Request
    T TryResolve<T>();                                 // Resolve dependency at runtime
    T ResolveService<T>();                             // Resolve an auto-wired service
    T GetPlugin<T>();                                  // Resolve optional registered Plugin
    T AssertPlugin<T>();                               // Resolve required registered Plugin
    void Dispose();                                    // Override to implement custom IDispose
    ValueTask DisposeAsync();                          // Implement IAsyncDisposable (.NET v4.7.2+)
}
```

### Basic example - Handling Any HTTP Verb

Let's revisit the simple example from earlier:

```csharp
[Route("/contacts")]
public class GetContacts : IReturn<List<Contact>> { }

public class ContactsService : Service
{
    public object Get(GetContacts request) => Db.Select<Contact>();
}
```

ServiceStack maps HTTP Requests to your Services' **Actions**. An Action is any method that:

- Is `public`
- Only contains a **single argument - the typed Request DTO**
- Has a method name matching an **HTTP Method** or **Any** (the fallback that can handle "ANY" method)
- Methods can have a **Format** suffix to handle specific formats, e.g. if it exists, `GetJson` will handle **GET JSON** requests
- Can specify either `T` or `object` Return type, both have the same behavior

### Content-Type Specific Service Implementations

Service methods can also use `Verb{Format}` method names to provide a different implementation for handling a specific Content-Type.
The Service below defines several different implementations for handling the same Request:

```csharp
[Route("/my-request")]
public class MyRequest
{
    public string Name { get; set; }
}

public class ContentTypeServices : Service
{
    // Handles GET /my-request for JSON responses
    public object GetJson(MyRequest request) => ...;

    // Handles GET /my-request for HTML Responses
    public object GetHtml(MyRequest request) =>
        $@"<html>
            <body>
                <h1>GetHtml {request.Name}</h1>
            </body>
        </html>";

    // Handles other POST/PUT/etc Verbs for HTML Responses
    public object AnyHtml(MyRequest request) =>
        $@"<html>
            <body>
                <h1>AnyHtml {request.Name}</h1>
            </body>
        </html>";

    // Handles all other unspecified Verbs/Formats
    public object Any(MyRequest request) => ...;
}
```

### Optional *Async Suffixes

In addition, your Service methods can optionally use the `*Async` suffix, which by .NET (and ServiceStack) guidelines is preferred for async methods to telegraph to client call sites that the response should be awaited:

```csharp
[Route("/contacts")]
public class GetContacts : IReturn<List<Contact>> { }

public class ContactsService : Service
{
    public async Task<object> GetAsync(GetContacts request) =>
        await Db.SelectAsync<Contact>();

    public object GetHtmlAsync(MyRequest request) =>
        $@"<html>
            <body>
                <h1>GetHtml {request.Name}</h1>
            </body>
        </html>";
}
```

If both exist (e.g. `Post()` and `PostAsync()`) the `*Async` method takes precedence and is invoked instead. Allowing both is useful if you have internal services directly invoking other Services using `HostContext.ResolveService<T>()`, where you can upgrade your Service to use an async implementation without breaking existing clients, e.g.
this is used in [RegisterService.cs](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack/Auth/RegisterService.cs):

```csharp
[Obsolete("Use PostAsync")]
public object Post(Register request)
{
    try
    {
        var task = PostAsync(request);
        return task.GetResult();
    }
    catch (Exception e)
    {
        throw e.UnwrapIfSingleException();
    }
}

/// <summary>
/// Create new Registration
/// </summary>
public async Task<object> PostAsync(Register request)
{
    //... async impl
}
```

This changes to use an async implementation whilst retaining backwards compatibility with existing call sites, e.g:

```csharp
using var service = HostContext.ResolveService<RegisterService>(Request);
var response = service.Post(new Register { ... });
```

This is important if the response is ignored, as the C# compiler won't give you any hints to await the response, which can lead to timing issues where the Service is invoked but User Registration hasn't completed as is often assumed. Alternatively you can rename your method to use the `*Async` suffix so the C# compiler will fail on existing call sites, which you can then update to `await` the async `Task` response, e.g:

```csharp
using var service = HostContext.ResolveService<RegisterService>(Request);
var response = await service.PostAsync(new Register { ... });
```

### Group Services by Tag

Related Services can be grouped by annotating **Request DTOs** with the `[Tag]` attribute, which enables functionality in a number of ServiceStack's metadata services where they're used to [Group Services in Open API](https://swagger.io/docs/specification/grouping-operations-with-tags/).
This feature could be used to tag which Services are used by different platforms:

```csharp
[Tag("web")]
public class WebApi : IReturn<MyResponse> {}

[Tag("mobile")]
public class MobileApi : IReturn<MyResponse> {}

[Tag("web"),Tag("mobile")]
public class WebAndMobileApi : IReturn<MyResponse> {}
```

Where they'll appear as a tab to additionally filter APIs in metadata pages:

![](/img/pages/metadata/tag-groups.webp)

They're also supported in [Add ServiceStack Reference](/add-servicestack-reference) where they can be used in the [IncludeTypes](/csharp-add-servicestack-reference#includetypes) DTO customization option, where tags can be specified using braces in the format `{tag}` or `{tag1,tag2,tag3}`, e.g:

```
/* Options:
IncludeTypes: {web,mobile}
```

Or individually:

```
/* Options:
IncludeTypes: {web},{mobile}
```

It works similar to the [Dependent Type References wildcard syntax](/csharp-add-servicestack-reference#include-request-dto-and-its-dependent-types) where it expands all Request DTOs with the tag to include all their referenced types, so including a `{web}` tag would be equivalent to including all Request DTOs & referenced types with that tag, e.g:

```
/* Options:
IncludeTypes: WebApi.*,WebAndMobileApi.*
```

### Micro ORMs and ADO.NET's IDbConnection

Code-First Micro ORMs like [OrmLite](/ormlite/) and [Dapper](https://github.com/StackExchange/Dapper) provide a pleasant high-level experience whilst working directly against ADO.NET's low-level `IDbConnection`. They both support all major databases so you immediately have access to a flexible RDBMS option out-of-the-box. At the same time you're not limited to using the providers contained in the `Service` class and can continue to use your own registered IOC dependencies (inc. an alternate IOC itself).

### Micro ORM POCOs make good DTOs

The POCOs used in Micro ORMs are particularly well suited for re-using as DTOs since they don't contain any of the circular references that the Heavy ORMs have (e.g. EF).
OrmLite goes one step further and borrows pages from NoSQL's playbook where any complex property, e.g. `List<MyPoco>`, is transparently blobbed in a schema-less text field, promoting the design of frictionless **Pure POCOs** that are uninhibited by RDBMS concerns. In many cases these POCO data models already make good DTOs and can be returned directly instead of mapping to domain-specific DTOs.

### Calling Services from a Typed C# Client

In Service development your Service DTOs provide your technology-agnostic **Service Layer** which you want to keep clean and as dependency-free as possible for maximum accessibility and potential re-use. Our recommendation is to follow our [Recommended Physical Project Structure](/physical-project-structure) and keep your DTOs in a separate ServiceModel project, which ensures a well-defined ServiceContract [decoupled from their implementation and accessible from any client](/service-complexity-and-dto-roles#data-transfer-objects---dtos). This recommended physical project structure is embedded in each [ServiceStack VS.NET Template](/templates/).

One of ServiceStack's strengths is its ability to re-use your Server DTOs on the client, enabling ServiceStack's productive end-to-end typed API. ServiceStack's use of Typed DTOs in its message-based design enables greater resiliency for your Services: the exact DTOs aren't needed, only the shape of the DTOs is important, and clients can also opt to use partial DTOs containing just the fields they're interested in. In the same way, extending existing Services with new optional properties won't break existing clients using older DTOs.
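A sketch of what consuming a partial DTO might look like (the `Age` property and the trimmed client copy are hypothetical; in practice the two classes would live in separate Server and Client projects):

```csharp
// Server DTO (in the Server's ServiceModel project):
public class Contact
{
    public int Id { get; set; }
    public string Name { get; set; }
    public int? Age { get; set; }
}

// Client's partial copy with just the fields it's interested in.
// Extra fields in the JSON payload are simply ignored during deserialization:
public class Contact
{
    public int Id { get; set; }
    public string Name { get; set; }
}
```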
When developing both Server and Client applications, the easiest way to call typed Services from clients is to just have them reference the same ServiceModel .dll the Server uses to define its Service Contract, or for clients that only need to call a couple of Services you can choose to instead copy the class definitions as-is. In both cases calling Services is exactly the same, where the Request DTO can be used with any of the generic [C#/.NET Service Clients](/csharp-client) to call Services using a succinct typed API, e.g:

#### Service Model Classes

```csharp
[Route("/contacts")]
public class GetContacts : IReturn<List<Contact>> { }

public class Contact { ... }
```

Which can be used in any ServiceClient with:

```csharp
var client = new JsonApiClient(BaseUri);
List<Contact> response = client.Get(new GetContacts());
```

Which makes a **GET** web request to the `/contacts` route. Custom Routes on Request DTOs are also not required, as when none are defined the client automatically falls back to using ServiceStack's [pre-defined routes](/routing#pre-defined-routes).

### Generating Typed DTOs

In addition to sharing your `ServiceModel.dll` with .NET clients to enable a typed end-to-end API without code-gen, clients can alternatively use [Add ServiceStack Reference](/csharp-add-servicestack-reference) support as an alternative way to get the Services' typed DTOs on the client. In both cases the exact same source code is used to call the Services:

```csharp
var client = new JsonApiClient(BaseUri);
var response = client.Get(new GetContacts());
```

Add ServiceStack Reference is also available for [most popular languages](/add-servicestack-reference) used in developing Web, Mobile and Desktop Apps.
#### Custom API Requests When preferred, you can also use the previous more explicit client API (ideal for when you don't have the `IReturn<>` marker) which lets you call the Service using just its route: ```csharp var response = client.Get<List<Contact>>("/contacts"); ``` ::: info All these Service Client APIs **have async equivalents** with an `*Async` suffix ::: ### API QueryParams ServiceStack's message-based design is centered around sending a single message which is all that's required to invoke any Typed API, however there may be times when you need to send additional params where you can't change the API's Request DTO definition or in AutoQuery's case its [Implicit Conventions](/autoquery/rdbms#implicit-conventions) would require too many permutations to be able to type the entire surface area on each Request DTO. Typically this would inhibit being able to invoke these Services from a typed Service Client API that would instead need to either use the untyped [`Get<T>(relativeUrl)`](https://reference.servicestack.net/api/ServiceStack/IRestClient/#-gettresponsestring) ServiceClient APIs or [HTTP Utils](/http-utils) to construct the API Request path manually. Alternatively Request DTOs can implement `IHasQueryParams` where any entries will be sent as additional query params along with the typed DTO: ```csharp public interface IHasQueryParams { Dictionary<string, string> QueryParams { get; set; } } ``` Which is available in all AutoQuery DTOs where it's added as a non-serializable property so it's only included in the QueryString: ```csharp [DataContract] public abstract class QueryBase : IQuery, IHasQueryParams { //... 
[IgnoreDataMember]
public virtual Dictionary<string, string> QueryParams { get; set; }
}
```

Which allows using existing ServiceClient typed APIs to send a combination of untyped queries in AutoQuery requests, e.g:

```csharp
var api = await client.ApiAsync(new QueryContacts {
    IdsIn = new[]{ 1, 2, 3 },
    QueryParams = new() {
        ["LastNameStartsWith"] = "A"
    }
});
```

## Everything centered around Request DTOs

A nice property of ServiceStack's message-based design is that all functionality is centered around Typed Request DTOs, which easily lets you take advantage of high-level value-added functionality like [Auto Batched Requests](/auto-batched-requests) or [Encrypted Messaging](/auth/encrypted-messaging) which are enabled automatically without any effort, or easily opt in to enhanced functionality by decorating Request DTOs or their Services with Metadata and [Filter Attributes](/filter-attributes), and everything works together, bound to typed models naturally.

E.g. you can take advantage of [ServiceStack's Razor support](https://razor.netcore.io/) and create a web page for this service by just adding a Razor view with the same name as the Request DTO in the `/Views` folder. For the `GetContacts` Request DTO you can just add `/Views/GetContacts.cshtml` and it will get rendered with the Service's Response DTO as its View Model when the Service is called from a browser (i.e. HTTP Request with `Accept: text/html`).

Thanks to ServiceStack's built-in Content Negotiation you can fetch the HTML contents calling the same url:

```csharp
var html = $"{BaseUri}/contacts".GetStringFromUrl(accept:"text/html");
```

This [feature is particularly nice](https://razor.netcore.io/#unified-stack) as it lets you **re-use your existing services** to serve both Web and Native Mobile and Desktop clients.
### Action Filters

Service actions can also contain fine-grained application of Request and Response filters, e.g:

```csharp
public class ContactsService : Service
{
    [ClientCanSwapTemplates]
    public object Get(GetContacts request) => Db.Select<Contact>();
}
```

This Request Filter allows the client to [change the selected Razor **View** and **Template**](https://razor.netcore.io/#unified-stack) used at runtime. By default the view with the same name as the **Request** or **Response** DTO is used.

## Handling different HTTP Verbs

ServiceStack Services let you handle any HTTP Verb in the same way, e.g. this lets you respond with CORS headers to an HTTP **OPTIONS** request with:

```csharp
public class ContactsService : Service
{
    [EnableCors]
    public void Options(GetContact request) {}
}
```

If you now make an OPTIONS request to the above service, it will emit the default `[EnableCors]` headers:

```csharp
var webReq = (HttpWebRequest)WebRequest.Create(Host + "/contacts");
webReq.Method = "OPTIONS";
using var webRes = webReq.GetResponse();
webRes.Headers["Access-Control-Allow-Origin"]  // *
webRes.Headers["Access-Control-Allow-Methods"] // GET, POST, PUT, DELETE, OPTIONS
webRes.Headers["Access-Control-Allow-Headers"] // Content-Type
```

### PATCH request example

Handling a PATCH request is just as easy, e.g.
here's an example of using PATCH to handle a partial update of a Resource:

```csharp
[Route("/contacts/{Id}", "PATCH")]
public class UpdateContact : IReturn<Contact>
{
    public int Id { get; set; }
    public int Age { get; set; }
}

public Contact Patch(UpdateContact request)
{
    var contact = request.ConvertTo<Contact>();
    Db.UpdateNonDefaults(contact);
    return Db.SingleById<Contact>(request.Id);
}
```

And the client call is just as easy as you would expect:

```csharp
var response = client.Patch(new UpdateContact { Id = 1, Age = 18 });
```

Although sending different HTTP Verbs is unrestricted in native clients, they're unfortunately not allowed in some web browsers and proxies. So to simulate a PATCH from an AJAX request you need to set the **X-Http-Method-Override** HTTP Header.

## Structured Error Handling

When following the [explicit Response DTO Naming convention](/error-handling#error-response-types) ServiceStack will automatically populate the `ResponseStatus` property with a structured Error Response, otherwise if returning other DTOs like naked collections ServiceStack will instead return a generic `ErrorResponse`. Although this is mostly a transparent technical detail you don't need to know about, as for schema-less formats like JSON they return the exact same wire-format.
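A sketch of setting the override header with a `JsonServiceClient`, assuming the server honors `X-Http-Method-Override` and reusing the `UpdateContact` DTO above (sending the request over POST is an assumption for clients restricted to GET/POST):

```csharp
var client = new JsonServiceClient(baseUri);

// Send a POST over the wire, but instruct ServiceStack to process it as a PATCH
client.RequestFilter = httpReq =>
    httpReq.Headers["X-Http-Method-Override"] = "PATCH";

var response = client.Post<Contact>("/contacts/1", new UpdateContact { Id = 1, Age = 18 });
```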
[Error Handling](/error-handling) works naturally in ServiceStack where you can simply throw C# Exceptions, e.g:

```csharp
public List<Contact> Post(Contact request)
{
    if (!request.Age.HasValue)
        throw new ArgumentException("Age is required");

    Db.Insert(request.ConvertTo<Contact>());
    return Db.Select<Contact>();
}
```

This will result in an Error thrown on the client if it tries to create an empty Contact:

```csharp
try
{
    var response = client.Post(new Contact());
}
catch (WebServiceException webEx)
{
    webEx.StatusCode                   // 400
    webEx.StatusDescription            // ArgumentException
    webEx.ResponseStatus.ErrorCode     // ArgumentException
    webEx.ResponseStatus.Message       // Age is required
    webEx.ResponseDto is ErrorResponse // true
}
```

The same Service Client Exception handling is also used to handle any HTTP error generated in or outside of your service, e.g. here's how to detect if an HTTP Method isn't implemented or is disallowed:

```csharp
try
{
    var response = client.Send(new SearchContacts());
}
catch (WebServiceException webEx)
{
    webEx.StatusCode        // 405
    webEx.StatusDescription // Method Not Allowed
}
```

In addition to standard C# exceptions your services can also return multiple, rich and detailed validation errors as enforced by [Fluent Validation's validators](/validation).

### Overriding the default Exception handling

You can override the default exception handling in ServiceStack by registering a handler in `ServiceExceptionHandlers`, e.g:

```csharp
void Configure(Container container)
{
    this.ServiceExceptionHandlers.Add((req, reqDto, ex) => {
        // Inspect or log the Exception, then return a custom error Response,
        // or return null to fall back to the default error handling
        return ...;
    });
}
```

## Smart Routing

For the most part you won't need to know about this as ServiceStack's routing works as you would expect. Although this should still serve as a good reference to describe the resolution order of ServiceStack's Routes:

1. Any exact Literal Matches are used first
2. Exact Verb match is preferred over All Verbs
3. The more variables in your route the less weighting it has
4.
When Routes have the same weight, the order is determined by the position of the Action in the service or the Order of Registration (FIFO)

These rules only come into play when there are multiple routes that match the pathInfo of an incoming request. Let's see some examples of these rules in action using the routes defined in the [API Design test suite](https://github.com/ServiceStack/ServiceStack/blob/master/tests/RazorRockstars.Console.Files/ReqStarsService.cs):

```csharp
[Route("/contacts")]
public class Contact {}

[Route("/contacts", "GET")]
public class GetContacts {}

[Route("/contacts/{Id}", "GET")]
public class GetContact {}

[Route("/contacts/{Id}/{Field}")]
public class ViewContact {}

[Route("/contacts/{Id}/delete")]
public class DeleteContact {}

[Route("/contacts/{Id}", "PATCH")]
public class UpdateContact {}

[Route("/contacts/reset")]
public class ResetContact {}

[Route("/contacts/search")]
[Route("/contacts/aged/{Age}")]
public class SearchContacts {}
```

These are the results for these HTTP Requests:

```
GET   /contacts           => GetContacts
POST  /contacts           => Contact
GET   /contacts/search    => SearchContacts
GET   /contacts/reset     => ResetContact
PATCH /contacts/reset     => ResetContact
PATCH /contacts/1         => UpdateContact
GET   /contacts/1         => GetContact
GET   /contacts/1/delete  => DeleteContact
GET   /contacts/1/foo     => ViewContact
```

And if there were multiple of the exact same routes declared like:

```csharp
[Route("/req/{Id}", "GET")]
public class Req2 {}

[Route("/req/{Id}", "GET")]
public class Req1 {}

public class MyService : Service
{
    public object Get(Req1 request) { ... }
    public object Get(Req2 request) { ... }
}
```

The Route on the Action that was declared first gets selected, i.e:

```
GET /req/1 => Req1
```

### Populating Complex Type Properties on QueryString

ServiceStack uses the [JSV-Format](/jsv-format) (JSON without quotes) to parse QueryStrings.
JSV lets you embed deep object graphs in the QueryString as seen in [this example url](https://test.servicestack.net/json/reply/StoreLogs?Loggers=%5B%7BId:786,Devices:%5B%7BId:5955,Type:Panel,TimeStamp:1199303309,Channels:%5B%7BName:Temperature,Value:58%7D,%7BName:Status,Value:On%7D%5D%7D,%7BId:5956,Type:Tank,TimeStamp:1199303309,Channels:%5B%7BName:Volume,Value:10035%7D,%7BName:Status,Value:Full%7D%5D%7D%5D%7D%5D):

```
https://test.servicestack.net/json/reply/StoreLogs?Loggers=[{Id:786,Devices:[{Id:5955,Type:Panel,
Channels:[{Name:Temperature,Value:58},{Name:Status,Value:On}]},
{Id:5956,Type:Tank,TimeStamp:1199303309,
Channels:[{Name:Volume,Value:10035},{Name:Status,Value:Full}]}]}]
```

## Advanced Usages

### Custom Hooks

The ability to extend ServiceStack's service execution pipeline with Custom Hooks is an advanced customization feature that is mostly not needed, as the preferred way to add composable functionality to your services is to use [Request / Response Filter attributes](/filter-attributes) or apply them globally with [Global Request/Response Filters](/request-and-response-filters).

### Custom Serialized Responses

The new `IHttpResult.ResultScope` API provides an opportunity to execute serialization within a custom scope, e.g.
this can be used to customize the serialized response of adhoc services that differs from the default global configuration with:

```csharp
return new HttpResult(dto) {
    ResultScope = () => JsConfig.With(new Config { IncludeNullValues = true })
};
```

Which enables custom serialization behavior by performing the serialization within the custom scope, equivalent to:

```csharp
using (JsConfig.With(new Config { IncludeNullValues = true }))
{
    var customSerializedResponse = Serialize(dto);
}
```

### Request and Response Converters

The [Encrypted Messaging Feature](/auth/encrypted-messaging) takes advantage of Request and Response Converters that let you change the Request DTOs and Response DTOs that get used in ServiceStack's Request Pipeline, where:

#### Request Converters

Request Converters are executed directly after any [Custom Request Binders](/serialization-deserialization#create-a-custom-request-dto-binder):

```csharp
appHost.RequestConverters.Add(async (req, requestDto) => {
    //Return alternative Request DTO or null to retain existing DTO
});
```

#### Response Converters

Response Converters are executed directly after the Service:

```csharp
appHost.ResponseConverters.Add(async (req, response) => {
    //Return alternative Response or null to retain existing Service response
});
```

### Intercept Service Requests

As an alternative to creating a [Custom Service Runner](#using-a-custom-servicerunner) to intercept different events when processing ServiceStack Requests, you can instead override the `OnBeforeExecute()`, `OnAfterExecute()` and `OnExceptionAsync()` callbacks in your `Service` class (or base class) to intercept and modify Request DTOs, Responses or Error Responses, e.g:

```csharp
class MyServices : Service
{
    // Log all Request DTOs that implement IHasSessionId
    public override void OnBeforeExecute(object requestDto)
    {
        if (requestDto is IHasSessionId dtoSession)
        {
            Log.Debug($"{nameof(OnBeforeExecute)}: {dtoSession.SessionId}");
        }
    }

    //Return Response DTO Name in HTTP Header with Response
    public override object OnAfterExecute(object response)
    {
        return new HttpResult(response) {
            Headers = {
                ["X-Response"] = response.GetType().Name
            }
        };
    }

    //Return custom error with additional metadata
    public override Task<object> OnExceptionAsync(object requestDto, Exception ex)
    {
        var error = DtoUtils.CreateErrorResponse(requestDto, ex);
        if (error is IHttpError httpError)
        {
            var errorStatus = httpError.Response.GetResponseStatus();
            errorStatus.Meta = new Dictionary<string,string> {
                ["InnerType"] = ex.InnerException?.GetType().Name
            };
        }
        return Task.FromResult(error);
    }
}
```

#### Async Callbacks

For async callbacks your Services can implement `IServiceBeforeFilterAsync` and `IServiceAfterFilterAsync`, e.g:

```csharp
public class MyServices : Service, IServiceBeforeFilterAsync, IServiceAfterFilterAsync
{
    public async Task OnBeforeExecuteAsync(object requestDto)
    {
        //...
    }

    public async Task<object> OnAfterExecuteAsync(object response)
    {
        //...
        return response;
    }
}
```

If you're implementing `IService` instead of inheriting the concrete `Service` class, you can implement the interfaces directly:

```csharp
// Handle all callbacks
public class MyServices : IService, IServiceFilters
{
    //..
}

// Or individually, just the callbacks you want
public class MyServices : IService, IServiceBeforeFilter, IServiceAfterFilter, IServiceErrorFilter
{
    //..
}
```

### Custom Service Runner

The [IServiceRunner](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Web/IServiceRunner.cs) decouples the execution of your service from its implementation, which provides an alternative custom hook that lets you add custom behavior to all Services without needing to use a base Service class.
To add your own Service Hooks you just need to override the default Service Runner in your AppHost from its default implementation:

```csharp
public virtual IServiceRunner<TRequest> CreateServiceRunner<TRequest>(ActionContext actionContext)
{
    return new ServiceRunner<TRequest>(this, actionContext); //Cached per Service Action
}
```

With your own:

```csharp
public override IServiceRunner<TRequest> CreateServiceRunner<TRequest>(ActionContext actionContext)
{
    return new MyServiceRunner<TRequest>(this, actionContext); //Cached per Service Action
}
```

Where `MyServiceRunner<T>` is just a custom class implementing the custom hooks you're interested in, e.g:

```csharp
public class MyServiceRunner<T> : ServiceRunner<T>
{
    public override void OnBeforeExecute(IRequest req, T request, object service)
    {
        // Called just before any Action is executed
    }

    public override Task<object> ExecuteAsync(IRequest req, object instance, T requestDto)
    {
        // Called to execute the Service instance with the requestDto
        return base.ExecuteAsync(req, instance, requestDto);
    }

    public override object OnAfterExecute(IRequest req, object response, object service)
    {
        // Called just after any Action is executed, you can modify the response returned here as well
        return response;
    }

    public override Task<object> HandleExceptionAsync(IRequest req, T requestDto, Exception ex, object instance)
    {
        // Called whenever an exception is thrown in your Service's Action
        return base.HandleExceptionAsync(req, requestDto, ex, instance);
    }
}
```

## Limitations

One limitation of Services is that you can't split the handling of a single Resource (i.e. Request DTO) over multiple service implementations. If you find you need to do this because your service is getting too big, consider using partial classes to spread the implementation over multiple files. Another option is encapsulating some of the re-usable functionality into Logic dependencies and injecting them into your service.
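The partial-class workaround above might look like this (the file names and split are illustrative):

```csharp
// ContactsService.Read.cs
public partial class ContactsService : Service
{
    public object Get(GetContacts request) => Db.Select<Contact>();
}

// ContactsService.Write.cs - same class, different file
public partial class ContactsService
{
    public object Post(Contact request)
    {
        Db.Insert(request);
        return Db.Select<Contact>();
    }
}
```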
## Other Notes

Although they're not needed or used anywhere, [you can also use HTTP Verb interfaces](https://github.com/ServiceStack/ServiceStack/blob/34acc429ee04053ea766e4fb183e7aad7321ef5e/src/ServiceStack.Interfaces/IService.cs#L27) to enforce the correct signature required by the services, e.g:

```csharp
public class MyService : Service,
    IAny<GetContacts>,
    IGet<SearchContacts>,
    IPost<Contact>
{
    public object Any(GetContacts request) { .. }
    public object Get(SearchContacts request) { .. }
    public object Post(Contact request) { .. }
}
```

This has no effect on the runtime behaviour and your services will work the same way with or without the added interfaces.

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/api-explorer.md

---
title: API Explorer
---

API Explorer is a Postman & Swagger UI alternative built into every ServiceStack **v6+** App that lets you explore, discover & call your APIs with an Auto UI dynamically generated from your APIs' typed C# classes. It's built from the ground up with multiple levels of customization, supporting both declarative & programmatic models for customizing each property's Input UI control and each API's form grid layout, whilst also providing the ability to provide rich interactive HTML Components to document each of your APIs & their Types.

This video provides a quick overview of API Explorer's v1 featureset:

<lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="lUDlTMq9DHU" style="background-image: url('https://img.youtube.com/vi/lUDlTMq9DHU/maxresdefault.jpg')"></lite-youtube>

::: info DEMO
A Live demo is available at **/ui** in all ServiceStack **v6+** Apps, e.g: [blazor-vue.web-templates.io/ui](https://blazor-vue.web-templates.io/ui)
:::

The entire API Explorer UI is driven by the **rich metadata** around your APIs' typed Service Contracts and the AppHost's **registered plugins**.
- The **Sidebar** - Displaying a list of APIs each user has access to
- The **API** tab - Providing a dynamic form to call & inspect your APIs
- The **Details** tab - Containing a complete description of your APIs & their dependent types
- The **Code** tab - Letting API consumers browse API Service contracts in their preferred language ([11 languages supported](https://servicestack.net/service-reference))

<p class="text-center py-4 text-xl">Let's learn about each feature with screenshots 📷</p>

If your AppHost has the ServiceStack [AuthFeature](/auth/authentication-and-authorization) plugin registered, the home page will display a **Sign In** dialog based on its **configured Auth Providers**. This is what you'll see in a new [Vue Vite](https://blazor-vue.web-templates.io/ui) project which has **Credentials** Auth, **JWT** as well as **Facebook**, **Google** and **Microsoft** OAuth providers registered in `Configure.Auth.cs`:

```csharp
Plugins.Add(new AuthFeature(() => new CustomUserSession(),
    new IAuthProvider[] {
        new JwtAuthProvider(appSettings) {
            AuthKeyBase64 = appSettings.GetString("AuthKeyBase64"),
        },
        new CredentialsAuthProvider(appSettings),
        new FacebookAuthProvider(appSettings),
        new GoogleAuthProvider(appSettings),
        new MicrosoftGraphAuthProvider(appSettings),
    })
{
    IncludeDefaultLogin = false
});
```

## Integrated Sign In

Where it will dynamically render the **Sign In** form with the App's enabled Auth capabilities.

<a href="https://blazor-vue.web-templates.io/ui" class="block my-8 p-4 rounded shadow hover:shadow-lg">
  <img src="/img/pages/apiexplorer/signin.png">
</a>

Custom Auth Providers can provide their own Form Layout by overriding the `FormLayout`, e.g.
the above Credentials UI can be created with:

```csharp
public class CustomCredentialsAuthProvider : CredentialsAuthProvider
{
    public CustomCredentialsAuthProvider()
    {
        FormLayout = new() {
            Input.For<Authenticate>(x => x.UserName, c => {
                c.Label = "Email address";
                c.Required = true;
            }),
            Input.For<Authenticate>(x => x.Password, c => {
                c.Type = "Password";
                c.Required = true;
            }),
            Input.For<Authenticate>(x => x.RememberMe),
        };
    }
    //...
}
```

Where the `Input` utility can be used to configure most HTML Form Input control properties, automatically choosing the appropriate Input control for each property Type.

You can Sign In with any of the users in the [Vue Vite Sign In](https://blazor-vue.web-templates.io/signin) page configured in `Configure.AuthRepository.cs`, i.e:

| Username            | Password | Role     |
|---------------------|----------|----------|
| admin@email.com     | p@55wOrd | Admin    |
| manager@email.com   | p@55wOrd | Manager  |
| employee@email.com  | p@55wOrd | Employee |

If signed in with the **Admin** User and the [Admin Users](/admin-ui-users) plugin is configured:

```csharp
Plugins.Add(new AdminUsersFeature());
```

It also displays **Admin UI** links that only **Admin** Users have access to.

<a href="https://blazor-vue.web-templates.io/ui" class="block my-8 p-4 rounded shadow hover:shadow-lg">
  <img src="/img/pages/apiexplorer/admin-user.png">
</a>

If you'd like to, you can add personalized links for users in different roles, e.g.
this is what's used to populate the above UI for Admins:

```csharp
appHost.AddToAppMetadata(meta => {
    meta.Plugins.Auth.RoleLinks[RoleNames.Admin] = new List<LinkInfo> {
        new() { Href = "../admin-ui",       Label = "Dashboard",    Icon = Svg.ImageSvg(Svg.Create(Svg.Body.Home)) },
        new() { Href = "../admin-ui/users", Label = "Manage Users", Icon = Svg.ImageSvg(Svg.GetImage(Svg.Icons.Users, "currentColor")) },
    };
});
```

Once signed in, API Explorer expands to include all the protected APIs the signed-in user has access to, identifiable by the padlock icon.

### Disable API Explorer

::: info
API Explorer is powered by the rich API metadata provided by the `MetadataFeature` and can be customized through the `UiFeature` plugin. Removing either plugin disables API Explorer.

```csharp
Plugins.RemoveAll(x => x is UiFeature);
```
:::

## API Tab

After selecting an API to use from the left-hand menu, you will be greeted with a way to **call APIs** through an **Auto UI** generated based on the **Request DTO** schema. Submitting the form returns API results with:

- **Body** displaying a syntax-highlighted JSON response
- **Raw** showing raw JSON output in a textarea
- **Preview** tab displaying results in a human-friendly view

::: info
The `Raw` response forces a `CamelCase` response since the API Explorer interface needs consistent casing regardless of your application's default `TextCase`. Those using `SnakeCase` or `PascalCase` will see a different response outside of API Explorer.
:::

<a href="https://blazor-vue.web-templates.io/ui/QueryBookings?body=preview" class="block my-8 p-4 rounded shadow hover:shadow-lg">
  <img src="/img/pages/apiexplorer/api-form-QueryBookings.png">
</a>

Control types are based on the property types in your DTOs.
| UI Input                | Data Types                         |
|-------------------------|------------------------------------|
| `<select>`              | Enum, Custom Values                |
| `<input type=number>`   | Numeric Types                      |
| `<input type=date>`     | DateTime, DateTimeOffset, DateOnly |
| `<input type=time>`     | TimeSpan, TimeOnly                 |
| `<input type=checkbox>` | Boolean                            |
| `<input type=text>`     | default                            |

For example, the `CreateBooking` Request DTO defined in [Bookings.cs](https://github.com/NetCoreTemplates/blazor-vue/blob/main/MyApp.ServiceModel/Bookings.cs):

```csharp
[Tag("bookings"), Description("Create a new Booking")]
[Route("/bookings", "POST")]
[ValidateHasRole("Employee")]
[AutoApply(Behavior.AuditCreate)]
public class CreateBooking : ICreateDb<Booking>, IReturn<IdResponse>
{
    [Description("Name this Booking is for"), ValidateNotEmpty]
    public string Name { get; set; }
    public RoomType RoomType { get; set; }
    [ValidateGreaterThan(0)]
    public int RoomNumber { get; set; }
    [ValidateGreaterThan(0)]
    public decimal Cost { get; set; }
    public DateTime BookingStartDate { get; set; }
    public DateTime? BookingEndDate { get; set; }
    [Input(Type = "textarea")]
    public string? Notes { get; set; }
}
```

Generates the following UI:

<div class="flex justify-center py-8">
  <a href="https://blazor-vue.web-templates.io/ui/CreateBooking">
    <img src="/img/pages/apiexplorer/api-form-CreateBooking.png" style="max-width:850px;">
  </a>
</div>

This also shows how the [[Input]](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/InputAttribute.cs) attribute can be used to further customize the Input UI control for each property as a **declarative alternative** to `Input.For<T>()` above, which the `Notes` property uses to switch to a **textarea** control instead.

API Form also supports auto binding [Argument Exceptions](/error-handling) or [Fluent](/validation) & [Declarative](/declarative-validation) Validation rules, where any validation errors will be contextually displayed next to the invalid property.
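These field-level errors arrive in ServiceStack's standard `ResponseStatus` shape, where each member of `errors` identifies its property via `fieldName`. A simplified sketch of the per-field lookup (illustrative only - not API Explorer's actual code; the sample `status` object mimics a typical validation Error Response):

```javascript
// Map a ServiceStack ResponseStatus to a fieldName → message lookup so each
// error can be rendered next to its invalid input. Illustrative sketch only.
function fieldErrors(responseStatus) {
    const map = {}
    for (const err of responseStatus?.errors ?? []) {
        map[err.fieldName] = err.message
    }
    return map
}

// Example ResponseStatus, as returned for a failed CreateBooking validation
const status = {
    errorCode: 'NotEmpty',
    message: "'Name' must not be empty.",
    errors: [
        { fieldName: 'Name',       errorCode: 'NotEmpty',    message: "'Name' must not be empty." },
        { fieldName: 'RoomNumber', errorCode: 'GreaterThan', message: "'Room Number' must be greater than '0'." },
    ]
}

console.log(fieldErrors(status).Name) // 'Name' must not be empty.
```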
Here's both the resulting invalid UI & the Error Response DTO that generated it:

<div class="flex justify-center py-8">
  <a href="https://blazor-vue.web-templates.io/ui/CreateBooking">
    <img src="/img/pages/apiexplorer/api-form-CreateBooking-invalid.png" style="max-width:850px;">
  </a>
</div>

Contextual validation errors are displayed where possible, otherwise an **Error Summary** notification is displayed along with the API Response body containing the full API error information.

### JSON Form

Another useful API form feature is being able to call APIs with a **JSON request payload** which maintains a **2-way sync** with the Form's UI, letting you cycle between input modes to quickly construct your API request. Real-time JSON validation is also displayed for added assistance, warning you whenever the JSON is malformed.

<div class="flex justify-center py-8">
  <a href="https://blazor-vue.web-templates.io/ui/CreateBooking">
    <img src="/img/pages/apiexplorer/api-form-CreateBooking-json.png" style="max-width:850px;">
  </a>
</div>

## Details Tab

API Explorer also provides a place for users to find out more about your API through documentation generated from metadata in your API, optionally augmented with custom HTML modules giving additional context. This is where API consumers would go to learn about each API, displaying all relevant information at a glance.
For `CreateBooking` it shows:

- **POST** is the API's **preferred** HTTP Method
- Its **user-defined** and **pre-defined** routes
- It's a **protected** API limited to Authenticated Users with the **Employee** role
- It's categorized in the **bookings** tag group
- It's an [AutoQuery CRUD](/autoquery/crud) API implementing `ICreateDb<Booking>`, indicating it creates entries in the **Booking** RDBMS table
- It returns an `IdResponse`, which we can intuitively infer returns the new Booking **Id** for successfully created Bookings

<a href="https://blazor-vue.web-templates.io/ui/CreateBooking?tab=details" class="block my-8 p-4 rounded shadow hover:shadow-lg">
  <img src="/img/pages/apiexplorer/api-details-CreateBooking.png">
</a>

This API Definition, generated from the `CreateBooking` DTO, shows that the **Required** column used to document the API's required properties marks all properties as required except for the **nullable** Value and Reference Types when `#nullable` is enabled.

```csharp
public class CreateBooking : ICreateDb<Booking>, IReturn<IdResponse>
{
    [Description("Name this Booking is for"), ValidateNotEmpty]
    public string Name { get; set; }
    public RoomType RoomType { get; set; }
    [ValidateGreaterThan(0)]
    public int RoomNumber { get; set; }
    [ValidateGreaterThan(0)]
    public decimal Cost { get; set; }
    public DateTime BookingStartDate { get; set; }
    public DateTime? BookingEndDate { get; set; }
    [Input(Type = "textarea")]
    public string? Notes { get; set; }
}
```

::: info
Importantly, **Required** annotations are documentation only - your API still has to validate **required reference types** like `string` using your preferred validation method, e.g. using the `[ValidateNotEmpty]` declarative attribute, [Fluent Validation](/validation) or manual verification in your Service's C# implementation, throwing an `ArgumentException` for invalid properties
:::

All this data is inferred from your Services, with the ability to present data from additional metadata attributes such as:

| Attribute name  | Description                                                      |
|-----------------|------------------------------------------------------------------|
| `[Description]` | Class and properties text-only description.                      |
| `[Notes]`       | Class-only text and HTML description.                            |
| `[Tag]`         | Class-only categorization of services, a way to group services.  |
| `[Input]`       | Properties-only presentation data for input fields.              |

If Services require authentication using `Authenticate` or validation rules checking for a role or permission, they will be shown with a padlock (🔒) signifying that authentication is required.

Request and Response names are links showing C# generated code representations of your DTOs and dependent types. Text metadata such as `[Description]` will also flow down into the generated code as comments for additional context.

<a class="block my-8 p-4 rounded shadow hover:shadow-lg">
  <img src="/img/pages/apiexplorer/api-details-QueryBookings-code.png">
</a>

## Code Tab

The **Code** tab allows developers consuming your APIs from different programming backgrounds to explore each API in their **preferred programming language** - currently supporting [11 different languages](https://servicestack.net/service-reference).

It includes the necessary steps to call your APIs from client Apps, following the same pattern for each language:

1. Copy your API DTOs
2. Copy and install the package containing ServiceStack's generic JSON Service Client
3. Copy the initial source code pre-configured to call the API they want

At which point, without any code-gen or build tools, they'll end up with a Typed API configured to your API's endpoint. E.g.
this is what it looks like to **Python** developers utilizing our [most recently supported language](/releases/v5_12):

<a href="https://blazor-vue.web-templates.io/ui/QueryBookings?tab=code&lang=python" class="block my-8 p-4 rounded shadow hover:shadow-lg">
  <img src="/img/pages/apiexplorer/api-code-QueryBookings-python.png">
</a>

They can follow these steps to quickly incorporate your API into their existing Apps. In addition, **Python**, **C#** and **F#** developers can also copy this source code into [Jupyter Notebooks](/jupyter-notebooks) for an instant typed visual REPL to explore your APIs.

This is driven by the services that power the [Add ServiceStack Reference](./add-servicestack-reference.md) feature. This growing list of supported languages shows example code calling the specific API, the required ServiceStack client libraries to use, and how to update the API's DTOs.

## Responsive Design

API Explorer's responsive layout works well on Smart Phones and Tablets, letting you comfortably browse and call APIs on the go:

<div class="bg-gray-200 flex justify-center py-8">
  <img src="/img/pages/apiexplorer/api-explorer.gif" style="width:500px; border-left:1px solid #CACACA;border-bottom:1px solid #CACACA;">
</div>

## API Customizations

To become the preferred solution for documenting APIs, API Explorer was designed from scratch to support multiple levels of customization, from customizing each property's Input control and each API's Form Grid Layout to annotating each API or Type with declarative attributes & rich markup.
### API Annotations

Whilst the capability of adding rich API Docs is essential when needed, we expect plain C# attributes will often be used to document APIs, where `[Description]` can provide a short summary on a **Type** and its **Properties** whilst richer HTML markup can be added to any Type using `[Notes]`, as done in [Bookings.cs](https://github.com/NetCoreTemplates/blazor-vue/blob/main/MyApp.ServiceModel/Bookings.cs):

```csharp
[Tag("bookings"), Description("Find Bookings")]
[Notes("Find out how to quickly create a <a class='svg-external' target='_blank' href='https://youtu.be/rSFiikDjGos'>C# Bookings App from Scratch</a>")]
[Route("/bookings", "GET")]
[Route("/bookings/{Id}", "GET")]
[AutoApply(Behavior.AuditQuery)]
public class QueryBookings : QueryDb<Booking>
{
    public int? Id { get; set; }
}

[Description("Booking Details")]
[Notes("Captures a Persons Name & Room Booking information")]
public class Booking : AuditBase
{
    [AutoIncrement]
    public int Id { get; set; }
    //...
}
```

Where it generates clean API docs displayed in a human-friendly table layout containing the properties of its **Request DTO** Type and inherited **base class** properties, starting with the API's Request DTO followed by all its referenced dependent types - resulting in a details page containing a complete snapshot of all types used in the API:

<a href="https://blazor-vue.web-templates.io/ui/QueryBookings?tab=details" class="block my-8 p-4 rounded shadow hover:shadow-lg">
  <img src="/img/pages/apiexplorer/api-details-QueryBookings.png">
</a>

## API Docs

We can further enhance API Explorer with our own custom [Vue.js Components](https://vuejs.org/guide/essentials/component-basics.html) by adding them to your Host project's local `/modules/ui/docs` folder, which the Blazor WASM project template utilizes to showcase some customization examples:

<ul class="list-none">
    <li>
        <a href="https://github.com/LegacyTemplates/blazor-wasm/tree/main/MyApp/wwwroot/modules" class="font-medium">/modules</a>
        <ul class="list-none">
            <li>
                <span class="font-medium">/ui/docs</span>
                <ul class="list-none">
                    <li>
                        <a href="https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack.Blazor/tests/ServiceStack.Blazor.Bootstrap.Tests/Server/modules/ui/docs/CreateBookingDocs.mjs">
                            CreateBookingDocs.mjs
                        </a>
                    </li>
                    <li>
                        <a href="https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack.Blazor/tests/ServiceStack.Blazor.Bootstrap.Tests/Server/modules/ui/docs/TodoDocs.mjs">
                            TodoDocs.mjs
                        </a>
                    </li>
                </ul>
            </li>
        </ul>
    </li>
</ul>

Where you can enhance any of your APIs or DTOs with rich API docs by adding **Vue Components** to `/modules/ui/docs/*.mjs`, which get included together with API Explorer's own components in its single-file download.

API Explorer is built using [Vue.js](https://vuejs.org/guide/introduction.html), a popular JavaScript framework that's optimal for progressive enhancement, which your components can also take advantage of to build rich dynamic UIs.
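At its simplest, an API Docs component is a plain JavaScript module exporting a Vue component object. A minimal sketch for a hypothetical `MyApi` Request DTO (all names here are illustrative):

```javascript
// Minimal illustrative API Docs component for a hypothetical `MyApi` Request DTO.
// In /modules/ui/docs/MyApiDocs.mjs it would be exported as:
//   export const MyApiDocs = { ... }
const MyApiDocs = {
    template:`
    <div class="text-center my-3">
        <h2 class="text-2xl">My API</h2>
        <p class="text-gray-500">Rich HTML documentation for the MyApi Request DTO.</p>
    </div>`
}

console.log(MyApiDocs.template.includes('My API')) // true
```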
For auto registration of components, the `*.mjs` filename should match the name of the API doc component it exports, which needs to be named `{Type}Docs`. Here's a simple [CreateBookingDocs.mjs](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack.Blazor/tests/ServiceStack.Blazor.Bootstrap.Tests/Server/modules/ui/docs/CreateBookingDocs.mjs) component example used to generate API Docs for the `CreateBooking` API, which just references the **server** [AppMetadata](https://api.locode.dev/classes/shared.AppMetadata.html) to access server API info from the `/metadata/app.json` endpoint:

```js
import { inject } from "vue"

export const CreateBookingDocs = {
    template:`
    <div class="text-center my-3">
        <div class="flex justify-center">
            <svg class="w-10 h-10" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 48 48">
                <path fill="#CFD8DC" d="M5 38V14h38v24c0 2.2-1.8 4-4 4H9c-2.2 0-4-1.8-4-4z"/>
                <path fill="#F44336" d="M43 10v6H5v-6c0-2.2 1.8-4 4-4h30c2.2 0 4 1.8 4 4z"/>
                <g fill="#B71C1C"><circle cx="33" cy="10" r="3"/><circle cx="15" cy="10" r="3"/></g>
                <g fill="#B0BEC5"><path d="M33 3c-1.1 0-2 .9-2 2v5c0 1.1.9 2 2 2s2-.9 2-2V5c0-1.1-.9-2-2-2z"/><path d="M15 3c-1.1 0-2 .9-2 2v5c0 1.1.9 2 2 2s2-.9 2-2V5c0-1.1-.9-2-2-2z"/></g>
                <g fill="#90A4AE"><path d="M13 20h4v4h-4z"/><path d="M19 20h4v4h-4z"/><path d="M25 20h4v4h-4z"/><path d="M31 20h4v4h-4z"/><path d="M13 26h4v4h-4z"/><path d="M19 26h4v4h-4z"/><path d="M25 26h4v4h-4z"/><path d="M31 26h4v4h-4z"/><path d="M13 32h4v4h-4z"/><path d="M19 32h4v4h-4z"/><path d="M25 32h4v4h-4z"/><path d="M31 32h4v4h-4z"/></g>
            </svg>
            <h2 class="text-3xl ml-3 mb-3">Create Bookings API</h2>
        </div>
        <div class="text-gray-500 text-lg">
            <p>
                Create a new room Booking for our {{serviceName}} hotels.
            </p>
            <p>
                Here are some
                <a class="svg-external text-blue-800" target="_blank"
                   href="https://edition.cnn.com/travel/article/scoring-best-hotel-rooms/index.html">
                    good tips on making room reservations
                </a>
            </p>
        </div>
    </div>`,
    setup() {
        const server = inject('server')
        return { serviceName: server.app.serviceName }
    }
}
```

### Dynamic Components

[QueryTodos](https://blazor-vue.web-templates.io/ui/QueryTodos?tab=details) is a more advanced example that generates a dynamic UI shared by all TODO APIs, with a reactive **Mini Navigation UI** users can use to cycle through **all TODO API docs** via the `v-href="{ op }"` custom directive:

<a href="https://blazor-vue.web-templates.io/ui/QueryTodos?tab=details" class="block my-8 p-4 rounded shadow hover:shadow-lg">
  <img src="/img/pages/apiexplorer/api-details-docs-Todos.png">
</a>

Where instead of registering a single component, it exports an `install(app)` function to register API Doc components for all TODO CRUD APIs, all registered with the same `TodosDocs` component:

```js
import { inject, computed } from "vue"
import { humanize } from "@servicestack/client"

export function install(app) {
    const apis = {
        QueryTodos:  'Query Todos, returns all Todos by default',
        CreateTodo:  'Create a Todo',
        UpdateTodo:  'Update a Todo',
        DeleteTodo:  'Delete Todo by Id',
        DeleteTodos: 'Delete multiple Todos by Ids',
    }
    const apiNames = Object.keys(apis)
    const TodosDocs = {
        template:`
        <div class="mx-auto max-w-screen-md text-center py-8">
            <h2 class="text-center text-3xl">{{humanize(op.request.name)}}</h2>
            <p class="text-gray-500 text-lg my-3">{{apis[op.request.name]}}</p>
            <div class="flex justify-center text-left">
                <table>
                    <caption class="mt-3 text-lg font-normal">Other Todo APIs</caption>
                    <tr v-for="(info,name) in otherApis">
                        <th class="text-right font-medium pr-3">
                            <a v-href="{ op:name }" class="text-blue-800">{{humanize(name)}}</a>
                        </th>
                        <td class="text-gray-500">{{info}}</td>
                    </tr>
                </table>
            </div>
        </div>`,
        setup() {
            const store = inject('store')
            const op = computed(() => store.op)
            const otherApis = computed(() => apiNames.filter(x => x !== store.op.request.name)
                .reduce((acc,x) => { acc[x] = apis[x]; return acc }, {}))
            return { op, apis, otherApis, humanize, }
        }
    }
    const components = apiNames.reduce((acc, x) => { acc[x + 'Docs'] = TodosDocs; return acc }, {})
    app.components(components)
}
```

## Built-in App functionality

### JavaScript Libraries

Your custom components can utilize built-in libraries embedded in ServiceStack.dll, where they will have access to the latest [Vue 3](https://vuejs.org/guide/introduction.html) reactive fx, the [@servicestack/client](/javascript-client) client library and the [Vue 3 Tailwind Component library](/vue/), which they can import by package name, e.g:

```js
import { ref } from "vue"
import { useClient } from "@servicestack/vue"
import { humanify } from "@servicestack/client"
```

**Static Analysis**

As all package dependencies are written in TypeScript, you can install them as dev dependencies to get static analysis from their TypeScript definitions at dev time:

```bash
npm install -D vue
npm install -D @servicestack/client
npm install -D @servicestack/vue
```

Your components can access your App's Typed DTOs directly from the [ES6 Module DTO endpoint](/javascript-add-servicestack-reference) at `/types/mjs`, e.g:

```js
import { QueryCoupons } from "/types/mjs"
```

#### App functionality

Your components have access to most App functionality via the injected dependencies defined in API Explorer's [app.mjs](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack/tests/NorthwindAuto/ui/lib/app.mjs):

```js
const app = inject('app')                 // App for customizing Vue App, register components, providers, plugins, etc
const client = inject('client')           // JsonServiceClient for API Calls
const server = inject('server')           // AppMetadata (metadata for your Server App and APIs)
const store = inject('store')             // API Explorer's Reactive object model
const routes = inject('routes')           // usePageRoutes() Reactive store to manage its SPA routing
const breakpoints = inject('breakpoints') // useBreakpoints() Reactive store of Tailwind responsive breakpoints
```

Most of these create instances of common library features in [core.mjs](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack/tests/NorthwindAuto/wwwroot/js/core.mjs) that are documented at [api.locode.dev/modules/explorer.html](https://api.locode.dev/modules/explorer.html).

You're also not limited to what's in API Explorer - with full access to [JavaScript Modules](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Modules), you can import external 3rd Party packages the same way you import built-in packages.

### Built-in API Docs

ServiceStack's own built-in APIs use custom API Doc components to document themselves, e.g. [/ui/docs/RegisterDocs.mjs](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack/src/ServiceStack/modules/ui/docs/RegisterDocs.mjs):

```js
export const RegisterDocs = {
    template:`
    <div class="max-w-screen-md mx-auto text-center">
        <h2 class="text-2xl font-medium mb-3">Register API</h2>
        <p class="text-gray-500">
            Public API users can use to create a new User Account, can be added to your AppHost with:
        </p>
        <pre class="my-3"><code v-highlightjs="'Plugins.Add(new RegistrationFeature());'"></code></pre>
    </div>`
}
```

Generates docs for the built-in **Register** API that include **C#** syntax highlighting using the pre-configured [highlightjs](https://highlightjs.org) directive:

<a href="https://blazor-vue.web-templates.io/ui/Register?tab=details" class="block my-8 p-4 rounded shadow hover:shadow-lg">
  <img src="/img/pages/apiexplorer/api-details-docs-Register.png">
</a>

Whilst [/ui/docs/AuthenticateDocs.mjs](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack/src/ServiceStack/modules/ui/docs/AuthenticateDocs.mjs) demonstrates a more advanced example, generating a responsive dynamic tab layout
containing multiple relevant ServiceStack Auth YouTube videos:

<a href="https://blazor-vue.web-templates.io/ui/Authenticate?tab=details" class="block my-8 p-4 rounded shadow hover:shadow-lg">
  <img src="/img/pages/apiexplorer/api-details-docs-Authenticate.png">
</a>

### Live Reload

When your App is run with `dotnet watch`, it takes advantage of ASP.NET Core's built-in file watcher to enable an instant **live reload** developer UX when contributing API Docs.

<div class="flex justify-center py-8">
  <a href="https://youtu.be/lUDlTMq9DHU?t=521">
    <img src="/img/pages/apiexplorer/api-docs-livereload.gif">
  </a>
</div>

This results in being more productive than using C# attributes, as changes are immediately visible without a restart.

## Customizing API Explorer

You can override each built-in Component in API Explorer by maintaining local customized versions in `/wwwroot/modules/ui`, where each API can be documented by adding [Custom API Docs](/api-explorer#api-docs) to `/docs/*.mjs`, whilst existing components can be overridden in [/components/*.mjs](https://github.com/ServiceStack/ServiceStack/tree/main/ServiceStack/src/ServiceStack/modules/ui/components) and custom UI added to `custom.*`:

```files
/wwwroot/modules/ui
  /docs
    *.mjs
  /components
    *.mjs
  custom.js
  custom.css
  custom.html
```

The `custom.html` and `custom.js` files allow further customization by including custom scripts and HTML fragments at the bottom of API Explorer, letting you modify API Explorer after it's loaded.
### Override built-in Components

The built-in UIs also let you override existing components by adding custom versions in [/js/components](https://github.com/ServiceStack/ServiceStack/tree/main/ServiceStack/src/ServiceStack/js/components), e.g:

```js
const Brand = {
    template:`
    <div class="flex items-center flex-shrink-0 max-w-sidebar">
        <a title="My App" v-href="{ $page:'' }"
           class="text-2xl whitespace-nowrap overflow-x-hidden flex items-center">
            <svg xmlns="http://www.w3.org/2000/svg" class="w-8 h-8 ml-1 mr-2" viewBox="0 0 24 24">
                <path d="M23.834 8.101a13.912 13.912 0 0 1-13.643 11.72a10.105 10.105 0 0 1-1.994-.12a6.111 6.111 0 0 1-5.082-5.761a5.934 5.934 0 0 1 11.867-.084c.025.983-.401 1.846-1.277 1.871c-.936 0-1.374-.668-1.374-1.567v-2.5a1.531 1.531 0 0 0-1.52-1.533H8.715a3.648 3.648 0 1 0 2.695 6.08l.073-.11l.074.121a2.58 2.58 0 0 0 2.2 1.048a2.909 2.909 0 0 0 2.695-3.04a7.912 7.912 0 0 0-.217-1.933a7.404 7.404 0 0 0-14.64 1.603a7.497 7.497 0 0 0 7.308 7.405s.549.05 1.167.035a15.803 15.803 0 0 0 8.475-2.528c.036-.025.072.025.048.061a12.44 12.44 0 0 1-9.69 3.963a8.744 8.744 0 0 1-8.9-8.972a9.049 9.049 0 0 1 3.635-7.247a8.863 8.863 0 0 1 5.229-1.726h2.813a7.915 7.915 0 0 0 5.839-2.578a.11.11 0 0 1 .059-.034a.112.112 0 0 1 .12.053a.113.113 0 0 1 .015.067a7.934 7.934 0 0 1-1.227 3.549a.107.107 0 0 0-.014.06a.11.11 0 0 0 .073.095a.109.109 0 0 0 .062.004a8.505 8.505 0 0 0 5.913-4.876a.155.155 0 0 1 .055-.053a.15.15 0 0 1 .147 0a.153.153 0 0 1 .054.053A10.779 10.779 0 0 1 23.834 8.1zM8.895 11.628a2.188 2.188 0 1 0 2.188 2.188v-2.042a.158.158 0 0 0-.15-.15z" fill="#5C2D91"/>
            </svg>
            My App
        </a>
    </div>
    `
}
export default Brand
```

Which overrides the built-in `Brand` component to replace the Logo on the top-right of API Explorer, [Locode](/locode/) and the [Admin UI](/admin-ui) with the custom version below:

<div class="flex justify-center py-8">
  <a href="https://blazor-vue.web-templates.io/ui/">
    <img src="/img/pages/apiexplorer/brand-blazor-wasm.png"
style="max-width:850px;border:1px solid rgb(229 231 235);">
  </a>
</div>

A less invasive option, if you just want to use your own logo, is to configure the `UiFeature` plugin to override the default `BrandIcon`, as the other Jamstack templates do in their [Configure.AppHost.cs](https://github.com/NetCoreTemplates/nextjs/blob/main/api/MyApp/Configure.AppHost.cs):

```csharp
ConfigurePlugin<UiFeature>(feature => {
    feature.Info.BrandIcon.Uri = "/assets/img/logo.svg";
    feature.Info.BrandIcon.Cls = "inline-block w-8 h-8 mr-2";
});
```

<div class="flex justify-center py-8">
  <a href="https://blazor-vue.web-templates.io/ui/Register">
    <img src="/img/pages/apiexplorer/brand-vue-ssg.png" style="max-width:850px;border:1px solid rgb(229 231 235);">
  </a>
</div>

## Custom Form Layouts

Generated forms default to a two-column layout, but this can be controlled using `FormLayout` for a specific operation. The `appHost.ConfigureOperation<T>` method can be used to change the layout and order of the form used in API Explorer. For example, a `CreateCustomers` operation by default has the following properties:

```csharp
[Route("/customers", "POST")]
public class CreateCustomers : IReturn<IdResponse>, IPost, ICreateDb<Customers>
{
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public string Company { get; set; }
    public string Address { get; set; }
    public string City { get; set; }
    public string State { get; set; }
    public string Country { get; set; }
    public string PostalCode { get; set; }
    public string Phone { get; set; }
    public string Fax { get; set; }
    public string Email { get; set; }
    public long? SupportRepId { get; set; }
}
```

### Default Form UI

Which is presented in API Explorer using the following generated form by default.
<a class="block my-8 p-4 rounded shadow hover:shadow-lg">
  <img src="/img/pages/apiexplorer/create-customer-default.png">
</a>

### Custom Form Layout

By customizing this layout using `ConfigureOperation`, we can control the placement and other attributes of each `InputInfo`. When overriding the `FormLayout`, inputs are laid out as rows of columns, so grouping controls like `City`, `State` and `PostalCode` in the same row lets us control the presentation:

```csharp
appHost.ConfigureOperation<CreateCustomers>(operation => operation.FormLayout = new()
{
    Input.For<CreateCustomers>(x => x.FirstName,    c => c.FieldsPerRow(2)),
    Input.For<CreateCustomers>(x => x.LastName,     c => c.FieldsPerRow(2)),
    Input.For<CreateCustomers>(x => x.Email),
    Input.For<CreateCustomers>(x => x.Company),
    Input.For<CreateCustomers>(x => x.Address),
    Input.For<CreateCustomers>(x => x.City,         c => c.FieldsPerRow(3)),
    Input.For<CreateCustomers>(x => x.State,        c => c.FieldsPerRow(3)),
    Input.For<CreateCustomers>(x => x.PostalCode,   c => c.FieldsPerRow(3)),
    Input.For<CreateCustomers>(x => x.Country),
    Input.For<CreateCustomers>(x => x.Phone,        c => c.FieldsPerRow(2)),
    Input.For<CreateCustomers>(x => x.Fax,          c => c.FieldsPerRow(2)),
    Input.For<CreateCustomers>(x => x.SupportRepId),
});
```

Gives us the updated layout in API Explorer:

<a class="block my-8 p-4 rounded shadow hover:shadow-lg">
  <img src="/img/pages/apiexplorer/create-customer-custom-layout.png">
</a>

### Custom Input Controls

Each input field can be customized with client-side visual and behavioural changes by using `InputInfo` when customizing the `FormLayout`:

```csharp
Input.For<CreateCustomers>(x => x.Email, info => {
    info.Label = "Personal Email Address";
    info.Placeholder = "me@email.com";
    info.Type = "email";
})
```

Now our `label` and `placeholder` changes are visible, and trying to submit a value without an `@` produces a client-side warning.
<a class="block my-8 p-4 rounded shadow hover:shadow-lg">
  <img src="/img/pages/apiexplorer/create-customer-custom-input.png">
</a>

Values for `InputInfo` are merged with the `[Input]` attribute that can be used on Request DTO class properties. This allows you to keep the default layout while still controlling `Input` options directly on your Request DTO class:

```csharp
public class CreateCustomers : IReturn<IdResponse>, IPost, ICreateDb<Customers>
{
    [Input(Placeholder = "me@email.com", Type = "email", Label = "Personal Email Address")]
    public string Email { get; set; }
}
```

### Register Form Layout

The built-in `RegistrationFeature` also uses a custom Form layout to mask its password fields:

```csharp
appHost.ConfigureOperation<Register>(op => op.FormLayout = new()
{
    Input.For<Register>(x => x.DisplayName,     x => x.Help = "Your first and last name"),
    Input.For<Register>(x => x.Email,           x => x.Type = Input.Types.Email),
    Input.For<Register>(x => x.Password,        x => x.Type = Input.Types.Password),
    Input.For<Register>(x => x.ConfirmPassword, x => x.Type = Input.Types.Password),
});
```

Which overrides the default Auto UI Form to use this custom layout:

<div class="flex justify-center py-8">
  <a href="https://blazor-vue.web-templates.io/ui/Register">
    <img src="/img/pages/apiexplorer/api-form-Register.png" style="max-width:850px;">
  </a>
</div>

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/api-first-development.md

---
slug: api-first-development
title: API First Development
---

One message we continually try to reiterate is the importance of Services (aka APIs) having a well-defined, coarse-grained Services Contract that serves as the interface into your system to which all external consumers bind - making it the most important contract in your system.

A strategy we recommend for maximizing re-use of your Services is to design them from an API-first point of view where all consumers (e.g.
Desktop, Mobile and Web UIs) have equal access to your Services, since they all consume the same published APIs for all of their functionality.

## Benefits of Services

This is the development model ServiceStack has always promoted and what most of its features are centered around, where your Services Contract is defined by decoupled, impl-free DTOs. If your Services retain this property, they'll be able to encapsulate capabilities of any complexity and make them available remotely to all consumers with never any more complexity than the cost of a Service call:

![](/img/pages/dtos-role.png)

This is ultimately where most of the value of Services is derived: they're the ultimate form of encapsulating complexity and offer the highest level of software reuse. ServiceStack amplifies your Services' capabilities by making them available in multiple [Hosting Options](/why-servicestack#multiple-hosting-options), [serialization formats](/why-servicestack#multiple-pluggable-formats) and [MQ and SOAP endpoints](/why-servicestack#multiple-endpoints) to enable more seamless integrations in a variety of different scenarios, including native end-to-end Typed APIs for the most popular [Web, Mobile and Desktop Apps](/why-servicestack#multiple-clients) that reduce the effort and complexity required to call your Services in all consumers - multiplicatively increasing the value provided.

## API First Development Model

The typical practice in .NET has been to maintain **separate controllers** and logic for your **HTML UIs** and **API controllers** for your **HTTP APIs**. This approach forces code duplication and breaks your system's well-defined Service Contracts, where any custom logic in your MVC Controllers and Razor pages becomes another entry point into your system - no longer are all your system's capabilities available to all clients, some are only available when using a browser to navigate MVC pages.
Whereas if you develop your APIs first, focusing instead on exposing your System's functionality behind pure-logic APIs, all clients including Web, Mobile, Desktop clients and B2B integrations will be able to utilize your same well-tested System Interfaces. In ServiceStack there are no "MVC Controllers" just for HTML pages, there are only Services, which are written with pure logic that's unopinionated as to what clients are calling it, with clean **Request DTOs** received as Inputs that typically return clean **Response DTOs** as outputs. HTML is then just another serialization format, providing a View of your Services or serving as a bundled UI that works on top of your existing Services, in all cases calling the same well tested and defined Services that all other clients use. For web development this means that UI logic and Error handling should ideally utilize the pure API Error Responses rather than server-side pages, which easily get coupled to your server implementation rather than your external published APIs. ### Multiple Web UI Validation Examples using same Services To better demonstrate the benefits of this approach and show how there's no loss of flexibility, we've created the [World Validation](https://github.com/NetCoreApps/Validation) .NET Core App which uses the same pure unopinionated ServiceStack Services to support **8 different HTML UI strategies** including server HTML Rendered and Ajax Client forms, multiple View Engines, multiple layouts - all utilizing the same Services and declarative [Fluent Validation](/validation).
[![](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/apps/Validation/home.png)](https://github.com/NetCoreApps/Validation) <h4 align="center">View Source on GitHub <a href="https://github.com/NetCoreApps/Validation">NetCoreApps/Validation</a></h4> It should be noted that these are just examples of different HTML UIs; with no additional effort, all ServiceStack Services automatically provide native integrations into **all popular Mobile and Desktop Apps** with [Add ServiceStack Reference](/add-servicestack-reference). ## [World Validation](/world-validation) The annotated [World Validation Docs](/world-validation) walk through and showcase the implementations of the most popular **Server HTML rendered** approaches and **Client UI rendered** technologies, which are all able to use the same single suite of ServiceStack Services. --- # Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/auth/api-key-authprovider.md --- slug: api-key-authprovider title: API Key Auth Provider --- The API Key Auth Provider provides an alternative method for allowing external 3rd Parties access to your protected Services without needing to specify a password. API Keys are the preferred approach for many well-known public API providers used in system-to-system scenarios for several reasons: - **Simple** - It integrates easily with existing HTTP Auth functionality - **Independent from Password** - Limits exposure to the much more sensitive master user passwords that should ideally never be stored in plain-text. Resetting a User's Password or password reset strategies won't invalidate existing systems configured to use API Keys - **Entropy** - API Keys are typically much more secure than most normal User Passwords.
The configurable default has **24 bytes** of entropy (Guids have 16 bytes) generated from a secure random number generator that encodes to **32 chars** using URL-safe Base64 (Same as Stripe) - **Performance** - Thanks to their much greater entropy and independence from user-chosen passwords, API Keys are validated as fast as possible using a datastore Index. This is in contrast to validating hashed user passwords which, as a goal, require slower and more computationally expensive algorithms to try to make brute force attacks infeasible. Like most ServiceStack providers the new API Key Auth Provider is simple to use, integrates seamlessly with ServiceStack's existing Auth model and includes Typed end-to-end client/server support. <lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="NTCUT7atoLo" style="background-image: url('https://img.youtube.com/vi/NTCUT7atoLo/maxresdefault.jpg')"></lite-youtube> For familiarity and utility we've modeled our implementation around Stripe's API Key functionality whilst sharing many of the benefits of ServiceStack's Auth Providers: ## Simple and Integrated To register `ApiKeyAuthProvider` add it to the `AuthFeature` list of Auth Providers: ```csharp Plugins.Add(new AuthFeature(..., new IAuthProvider[] { new ApiKeyAuthProvider(AppSettings), new CredentialsAuthProvider(AppSettings), //... } )); ``` Or for Apps utilizing encapsulated [Modular Startup](/modular-startup) configuration blocks: ```csharp [assembly: HostingStartup(typeof(MyApp.ConfigureAuth))] public class ConfigureAuth : IHostingStartup { public void Configure(IWebHostBuilder builder) => builder .ConfigureAppHost(appHost => { appHost.Plugins.Add(new AuthFeature(() => new AuthUserSession(), new IAuthProvider[] { new ApiKeyAuthProvider(appHost.AppSettings), new CredentialsAuthProvider(appHost.AppSettings), //...
})); }); } ``` The `ApiKeyAuthProvider` works similarly to the other ServiceStack `IAuthWithRequest` providers where a request with a valid API Key initializes the current `IRequest` with the user's Authenticated Session. It also adds the [ApiKey](https://github.com/ServiceStack/ServiceStack/blob/c4a8f9741e496793d949c09cecb84e84fca86686/src/ServiceStack/Auth/ApiKeyAuthProvider.cs#L31) POCO Model to the request which can be accessed with: ```csharp ApiKey apiKey = req.GetApiKey(); ``` The `ApiKey` can later be inspected throughout the [request pipeline](/order-of-operations) to determine which API Key, Type and Environment was used. ## Interoperable Using existing HTTP Functionality makes it simple and interoperable to use with any HTTP Client, even command-line clients like curl where API Keys can be specified in the **Username** of HTTP Basic Auth: :::sh curl https://api.stripe.com/v1/charges -u yDOr26HsxyhpuRB3qbG07qfCmDhqutnA: ::: Or as an HTTP Bearer Token in the **Authorization** HTTP Request Header: :::sh curl https://api.stripe.com/v1/charges -H "Authorization: Bearer yDOr26HsxyhpuRB3qbG07qfCmDhqutnA" ::: Both of these methods are built into most HTTP Clients.
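For reference, here's a minimal sketch of the equivalent configuration using .NET's raw `HttpClient`, where the Basic scheme base64-encodes the API Key as the Username with an empty password (the base URL and key value are placeholders for illustration):

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;

var apiKey = "yDOr26HsxyhpuRB3qbG07qfCmDhqutnA"; // example key from above
var client = new HttpClient { BaseAddress = new Uri("https://example.org") };

// HTTP Basic Auth: the API Key as the Username, the Password left empty
var basic = Convert.ToBase64String(Encoding.UTF8.GetBytes(apiKey + ":"));
client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", basic);

// Or alternatively as a Bearer Token in the Authorization header
client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", apiKey);
```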
Here are a few different ways you can send API Keys using ServiceStack's [.NET Service Clients](/csharp-client): ```csharp var client = new JsonApiClient(baseUrl) { Credentials = new NetworkCredential(apiKey, "") }; var client = new JsonHttpClient(baseUrl) { BearerToken = apiKey }; ``` Or using the [HTTP Utils](/http-utils) extension methods: ```csharp var response = baseUrl.CombineWith("/secured").GetStringFromUrl( requestFilter: req => req.AddBasicAuth(apiKey, "")); var response = await "https://example.org/secured".GetJsonFromUrlAsync( requestFilter: req => req.AddBearerToken(apiKey)); ``` ## Sending API Key in Request DTOs Similar to the `IHasSessionId` interface, Request DTOs can also implement `IHasBearerToken` to send Bearer Tokens, e.g: ```csharp public class Secure : IHasBearerToken { public string BearerToken { get; set; } public string Name { get; set; } } var response = client.Get(new Secure { BearerToken = apiKey, Name = "World" }); ``` Alternatively you can set the `BearerToken` property on the Service Client once where it will automatically populate all Request DTOs that implement `IHasBearerToken`, e.g: ```csharp client.BearerToken = apiKey; var response = client.Get(new Secure { Name = "World" }); ``` ## Supported Auth Repositories The necessary functionality to support API Keys has been implemented in the following supported Auth Repositories: - `OrmLiteAuthRepository` - Supporting [most major RDBMS](/ormlite/#ormlite-rdbms-providers) - `RedisAuthRepository` - Uses Redis back-end data store - `DynamoDbAuthRepository` - Uses AWS DynamoDB data store - `MongoDbAuthRepository` - Uses MongoDB data store - `InMemoryAuthRepository` - Uses InMemory Auth Repository No additional configuration is required as it just utilizes the existing registered `IAuthRepository`. ## Multiple API Key Types and Environments You can specify any number of different Key Types for use in multiple environments for each user.
API Keys are generated upon User Registration, which by default creates both a **live** and **test** key for the **secret** Key Type. To also create both a "secret" and "publishable" API Key, configure it with: ```csharp Plugins.Add(new AuthFeature(..., new IAuthProvider[] { new ApiKeyAuthProvider(AppSettings) { KeyTypes = new[] { "secret", "publishable" }, } } )); ``` If preferred, any of the API Key Provider options can instead be specified in [App Settings](/appsettings) following the `apikey.{PropertyName}` format, e.g: ```xml <add key="apikey.KeyTypes" value="secret,publishable" /> ``` ## Cached API Key Sessions You can reduce the number of I/O Requests and improve the performance of API Key Auth Provider Requests by specifying a `SessionCacheDuration` to temporarily store the Authenticated UserSession against the API Key. This reduces subsequent API Key requests down to 1 DB call to fetch and validate the API Key + 1 Cache Hit to restore the User's Session, which if you're using the default in-memory Cache means only 1 I/O call is required per request.
This can be enabled with: ```csharp Plugins.Add(new AuthFeature(..., new IAuthProvider[] { new ApiKeyAuthProvider(AppSettings) { SessionCacheDuration = TimeSpan.FromMinutes(10), } })); ``` ## Multitenancy Thanks to ServiceStack's trivial support for enabling [Multitenancy](/multitenancy), the minimal configuration required to register an API Key Auth Provider that persists to a **LiveDb** SQL Server database and also allows Services called with a Test API Key to query the alternative **TestDb** database instead, is just: ```csharp class AppHost : AppSelfHostBase { public AppHost() : base("API Key Multitenancy Example", typeof(AppHost).Assembly) { } public override void Configure(Container container) { //Create and register an OrmLite DB Factory configured to use Live DB by default var dbFactory = new OrmLiteConnectionFactory( AppSettings.GetString("LiveDb"), SqlServerDialect.Provider); container.Register<IDbConnectionFactory>(dbFactory); // Register a "TestDb" Named Connection dbFactory.RegisterConnection("TestDb", AppSettings.GetString("TestDb"), SqlServerDialect.Provider); //Tell ServiceStack you want to persist User Auth Info in SQL Server container.Register<IAuthRepository>(c => new OrmLiteAuthRepository(dbFactory)); //Register the AuthFeature with the API Key Auth Provider Plugins.Add(new AuthFeature(() => new AuthUserSession(), new IAuthProvider[] { new ApiKeyAuthProvider(AppSettings) })); } public override IDbConnection GetDbConnection(IRequest req = null) { //If a Test API Key was used return DB connection to TestDb instead: return req.GetApiKey()?.Environment == "test" ? TryResolve<IDbConnectionFactory>().OpenDbConnection("TestDb") : base.GetDbConnection(req); } } ``` Now whenever a Test API Key is used to call an Authenticated Service, all `base.Db` Queries or AutoQuery Services will query **TestDb** instead.
## API Key Defaults The API Key Auth Provider has several options to customize its behavior, all of which except for the delegate Filters can also be specified in AppSettings: ```csharp new ApiKeyAuthProvider { // Whether to only permit access via API Key from a secure connection. (default true) public bool RequireSecureConnection { get; set; } // Generate different keys for different environments. (default live,test) public string[] Environments { get; set; } // Different types of Keys each user can have. (default secret) public string[] KeyTypes { get; set; } // How much entropy should the generated keys have. (default 24) public int KeySizeBytes { get; set; } // Whether to automatically expire keys. (default no expiry) public TimeSpan? ExpireKeysAfter { get; set; } // Automatically create ApiKey Table for Auth Repositories which need it. (true) public bool InitSchema { get; set; } // Change how API Key is generated public CreateApiKeyDelegate GenerateApiKey { get; set; } // Run custom filter after API Key is created public Action<ApiKey> CreateApiKeyFilter { get; set; } // Cache the User Session so it can be reused between subsequent API Key Requests public TimeSpan? SessionCacheDuration { get; set; } // Whether to allow API Keys in 'apikey' QueryString or FormData (e.g.
`?apikey={APIKEY}`) public bool AllowInHttpParams { get; set; } } ``` ## IManageApiKeys API Should you need to, you can access API Keys from the Auth Repository directly through the following interface: ```csharp public interface IManageApiKeys { void InitApiKeySchema(); bool ApiKeyExists(string apiKey); ApiKey GetApiKey(string apiKey); List<ApiKey> GetUserApiKeys(string userId); void StoreAll(IEnumerable<ApiKey> apiKeys); } ``` ::: info This interface also defines what's required in order to implement API Keys support on a Custom AuthRepository ::: For Auth Repositories which implement it, you can access the interface by resolving `IAuthRepository` from the IOC and casting it to the above interface, e.g: ```csharp var apiRepo = (IManageApiKeys)HostContext.TryResolve<IAuthRepository>(); var apiKeys = apiRepo.GetUserApiKeys(session.UserAuthId); ``` ## Built-in API Key Services To give end-users access to their keys the API Key Auth Provider enables 2 Services: the `GetApiKeys` Service to return all valid User API Keys for the specified environment: ```csharp //GET /apikeys/live var response = client.Get(new GetApiKeys { Environment = "live" }); response.Results.PrintDump(); //User's "live" API Keys ``` And the `RegenerateApiKeys` Service to invalidate all current API Keys and generate new ones for the specified environment: ```csharp //POST /apikeys/regenerate/live var response = client.Post(new RegenerateApiKeys { Environment = "live" }); response.Results.PrintDump(); //User's new "live" API Keys ``` You can modify which built-in Services you want registered, or modify the custom routes to where you want them to be available by modifying the `ServiceRoutes` collection. E.g. 
you can prevent it from registering any Services by setting `ServiceRoutes` to an empty collection: ```csharp new ApiKeyAuthProvider { ServiceRoutes = new Dictionary<Type, string[]>() } ``` ## Generating API Keys for Existing Users Whilst the API Key Auth Provider automatically generates API Keys for new users, you can also use the `ApiKeyAuthProvider` to generate new keys for existing users that don't have any. Here's a script you can use when using an `OrmLiteAuthRepository` to generate API Keys for all users with missing API Keys on startup: ```csharp public class ConfigureAuth : IHostingStartup { public void Configure(IWebHostBuilder builder) => builder .ConfigureAppHost(appHost => { appHost.Plugins.Add(new AuthFeature(() => new AuthUserSession(), new IAuthProvider[] { new ApiKeyAuthProvider(appHost.AppSettings) })); }, afterAppHostInit: appHost => { var authProvider = (ApiKeyAuthProvider) AuthenticateService.GetAuthProvider(ApiKeyAuthProvider.Name); var authRepo = (IManageApiKeys)appHost.TryResolve<IAuthRepository>(); using var db = appHost.TryResolve<IDbConnectionFactory>().Open(); var userWithKeysIds = db.Column<string>(db.From<ApiKey>() .SelectDistinct(x => x.UserAuthId)).Map(int.Parse); var userIdsMissingKeys = db.Column<string>(db.From<UserAuth>() // Use custom UserAuth if configured .Where(x => userWithKeysIds.Count == 0 || !userWithKeysIds.Contains(x.Id)) .Select(x => x.Id)); foreach (var userId in userIdsMissingKeys) { var apiKeys = authProvider.GenerateNewApiKeys(userId); authRepo.StoreAll(apiKeys); } }); } ``` :::info If using another Auth Repository backend this script will need to be modified to fetch the userIds for all users missing API Keys ::: ## .NET Framework Example Older platforms can register Startup initialization logic using the `HostContext.ConfigureAppHost()` singleton: ```csharp HostContext.ConfigureAppHost(afterAppHostInit:appHost => ...); ``` Or adding to `AfterInitCallbacks` in their `AppHost.Configure()`, e.g: ```csharp public override void
Configure(Container container) { AfterInitCallbacks.Add(appHost => ...); } ``` --- # Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/auth/apikeys.md --- title: API Keys --- API Keys are a simple and effective way to authenticate and authorize access to your APIs. They're typically used for machine-to-machine communication, where a client application needs to access an API without user intervention, and are often used to control access to specific resources or features in your API, providing a simple way to manage access control. ### Redesigning API Keys Building on our experience with API Keys in previous versions of ServiceStack, we've taken the opportunity to redesign how API Keys work to provide a more flexible and powerful way to manage access control for your APIs. The existing [API Key Auth Provider](https://docs.servicestack.net/auth/api-key-authprovider) was implemented as another Auth Provider that provided another way to authenticate a single user. The consequences of this were: - Initial API Requests were slow as they required going through the Authentication workflow to authenticate the user and set up authentication for that request - No support for fine-grained access control as API Keys had the same access as the authenticated user - API Keys had to be associated with a User, which was unnecessary for machine-to-machine communication Given the primary use-case for API Keys is machine-to-machine communication where the client is not a User, nor should the systems they give their API Keys to have access to their User Account, we've changed how API Keys work in .NET 10. ## .NET 10 API Keys Feature :::youtube U4vqOIHOs_Q New .NET 10 API Keys Feature with Built-In UIs!
::: The first design decision to overcome the above issues was to separate API Keys from Users and Authentication itself, where the new `ApiKeysFeature` is now just a plugin instead of an Auth Provider, which can be added to existing Identity Auth Apps with: :::sh npx add-in apikeys ::: Which will add the API Keys [Modular Startup](https://docs.servicestack.net/modular-startup) to your Host project, a minimal example of which looks like: ```csharp public class ConfigureApiKeys : IHostingStartup { public void Configure(IWebHostBuilder builder) => builder .ConfigureServices(services => { services.AddPlugin(new ApiKeysFeature()); }) .ConfigureAppHost(appHost => { using var db = appHost.Resolve<IDbConnectionFactory>().Open(); var feature = appHost.GetPlugin<ApiKeysFeature>(); feature.InitSchema(db); }); } ``` Where it registers the `ApiKeysFeature` plugin and creates the `ApiKey` table in the App's configured database if it doesn't already exist. ### Creating Seed API Keys The plugin can also be used to programmatically generate API Keys for specified Users: ```csharp if (feature.ApiKeyCount(db) == 0) { var createApiKeysFor = new [] { "admin@email.com", "manager@email.com" }; var users = IdentityUsers.GetByUserNames(db, createApiKeysFor); foreach (var user in users) { // Create a super API Key for the admin user List<string> scopes = user.UserName == "admin@email.com" ? 
[RoleNames.Admin] : []; var apiKey = feature.Insert(db, new() { Name="Seed Key", UserId=user.Id, UserName=user.UserName, Scopes=scopes }); var generatedApiKey = apiKey.Key; } } ``` ### Basic Usage With the plugin registered, you can now use the `ValidateApiKey` attribute to limit APIs to only be accessible with a valid API Key, e.g: ```csharp [ValidateApiKey] public class MyRequest {} ``` ### Use API Keys with or without Users and Authentication API Keys can optionally be associated with a User, but they don't have to be, nor do they run in the context of a User or have the ability to invoke any Authenticated APIs on their own. Users who create them can also limit their scope to only call APIs they have access to, which can be done with user-defined scopes: ### Scopes Scopes are user-defined strings that can be used to restrict APIs to only be accessible with API Keys that have the required scope. For example, we could generate API Keys that have **read only**, **write only** or **read/write** access to APIs by assigning them different scopes, e.g: ```csharp public static class Scopes { public const string TodoRead = "todo:read"; public const string TodoWrite = "todo:write"; } [ValidateApiKey(Scopes.TodoRead)] public class QueryTodos : QueryDb<Todo> {} [ValidateApiKey(Scopes.TodoWrite)] public class CreateTodo : ICreateDb<Todo>, IReturn<Todo> {} [ValidateApiKey(Scopes.TodoWrite)] public class UpdateTodo : IUpdateDb<Todo>, IReturn<Todo> {} [ValidateApiKey(Scopes.TodoWrite)] public class DeleteTodos : IDeleteDb<Todo>, IReturnVoid {} ``` Where only API Keys with the `todo:read` scope can access the `QueryTodos` API, and only API Keys with the `todo:write` scope can access the `CreateTodo`, `UpdateTodo` and `DeleteTodos` APIs. APIs that aren't assigned a scope can be accessed by any valid API Key. The only built-in Scope is `Admin` which, like the `Admin` role, enables full access to all `[ValidateApiKey]` APIs.
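To sketch the resulting behavior from a client's perspective, a key assigned only the `todo:read` scope can call the read API but gets an error response from the write APIs (the `readOnlyApiKey` variable and the `Text` property are assumed here for illustration):

```csharp
var client = new JsonApiClient(baseUrl) { BearerToken = readOnlyApiKey };

// Allowed: QueryTodos only requires the todo:read scope
var todos = await client.ApiAsync(new QueryTodos());

// Rejected: CreateTodo requires the todo:write scope this key doesn't have
var create = await client.ApiAsync(new CreateTodo { Text = "New Todo" });
if (create.Failed)
{
    Console.WriteLine(create.Error.ErrorCode); // inspect the error instead of throwing
}
```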
### Fine-grained Access Control Alternatively, API Keys can be restricted to only be able to access specific APIs. ### Features In addition to scopes, API Keys can also be tagged with user-defined **Features** which APIs can inspect to enable different behavior, e.g. a **Paid** feature could be used to increase rate limits or return premium content whilst a **Tracking** feature could be used to keep a record of API requests, etc. These can be accessed in your Services with: ```csharp public object Any(QueryTodos request) { if (Request.GetApiKey().HasFeature(Features.Paid)) { // return premium content } } ``` ## Protect same APIs with API Keys or Identity Auth Modern APIs need to serve different types of clients with distinct authentication requirements. **Identity Auth** is designed for interactive user workflows with sessions, roles, and permissions, while **API Keys** excel at machine-to-machine communication with simple token-based authentication, superior performance, and fine-grained scope-based access control. ### Supporting both Auth Models with 2 APIs Previously, supporting both auth models required maintaining two separate APIs: one protected with `[ValidateIsAuthenticated]` and another with `[ValidateApiKey]`, resulting in duplicate endpoints and docs. ```csharp // For authenticated users [ValidateIsAuthenticated] public class QueryOrders : QueryDb<Order> { } // For API key access [ValidateApiKey] public class QueryOrdersApiKey : QueryDb<Order> { } public class OrderService : Service { public List<Order> Get(QueryOrders request) { var userId = Request.GetRequiredUserId(); // Shared business logic } public List<Order> Get(QueryOrdersApiKey request) => Get(request.ConvertTo<QueryOrders>()); } public static class MyExtensions { public static string GetRequiredUserId(this IRequest? req) => req.GetApiKey()?.UserAuthId ?? req.GetClaimsPrincipal().GetUserId() ??
throw HttpError.Unauthorized("API Key must be associated with a user"); } ``` Whilst easy to implement, the biggest drawback with this approach is that it requires maintaining 2x APIs, 2x API endpoints, and 2x API docs. ### Allow API Key APIs to Authenticated Users From [ServiceStack v8.9](/releases/v8_09) you can protect the **same APIs and UIs** with both authentication models. By adding a user's API Key to the `apikey` claim in their Identity Auth Cookie, authenticated users can seamlessly access `[ValidateApiKey]` protected APIs without sending the key explicitly. This unified approach eliminates API duplication while maintaining all the benefits of both authentication paradigms: - Maintain a single API surface for all clients - Serve the same interactive UIs protected with Identity Auth or API Keys - Provide programmatic access via API Keys - Maintain all the benefits of API Keys To achieve this, users will need to have a valid API Key generated for them which would then need to be added to the `apikey` Claim in the `UserClaimsPrincipalFactory` to be included in their Identity Auth Cookie: ```csharp // Program.cs services.AddScoped<IUserClaimsPrincipalFactory<ApplicationUser>, AdditionalUserClaimsPrincipalFactory>(); // Add additional claims to the Identity Auth Cookie public class AdditionalUserClaimsPrincipalFactory( UserManager<ApplicationUser> userManager, RoleManager<IdentityRole> roleManager, IApiKeySource apiKeySource, IOptions<IdentityOptions> optionsAccessor) : UserClaimsPrincipalFactory<ApplicationUser,IdentityRole>( userManager, roleManager, optionsAccessor) { public override async Task<ClaimsPrincipal> CreateAsync(ApplicationUser user) { var principal = await base.CreateAsync(user); var identity = (ClaimsIdentity)principal.Identity!; var claims = new List<Claim>(); if (user.ProfileUrl != null) { claims.Add(new Claim(JwtClaimTypes.Picture, user.ProfileUrl)); } // Add Users latest valid API Key to their Auth Cookie's 'apikey' claim var latestKey =
(await apiKeySource.GetApiKeysByUserIdAsync(user.Id)) .OrderByDescending(x => x.CreatedDate) .FirstOrDefault(); if (latestKey != null) { claims.Add(new Claim(JwtClaimTypes.ApiKey, latestKey.Key)); } identity.AddClaims(claims); return principal; } } ``` After which Authenticated Users will be able to access `[ValidateApiKey]` protected APIs where the API Key in the `apikey` Claim is attached to the request - resulting in the same behavior as if they had sent their API Key with the request. ```csharp // For authenticated users or API Keys [ValidateApiKey] public class QueryOrders : QueryDb<Order> { } ``` ## Integrated UIs Like many of ServiceStack's other premium features, API Keys are fully integrated into [ServiceStack's built-in UIs](https://servicestack.net/auto-ui) including [API Explorer](https://docs.servicestack.net/api-explorer) and the [Admin UI](https://docs.servicestack.net/admin-ui). ### API Explorer Your Users and API Consumers can use API Explorer to invoke protected APIs with their API Key. API Key protected APIs will display a **key** icon next to the API instead of the **padlock** which is used to distinguish APIs that require Authentication.
Users can configure API Explorer with their API Key by either clicking the **key** icon on the top right or by clicking the **API Key** link on the alert message that appears when trying to access an API requiring an API Key: ![](/img/pages/auth/apikeys/apiexplorer-apikeys.png) Both of these will open the **API Key** dialog where they can paste their API Key: ![](/img/pages/auth/apikeys/apiexplorer-apikeys-dialog.png) :::info NOTE API Keys are not stored in localStorage and only available in the current session ::: ### Admin UI **Admin** users can view and manage API Keys in the API Key [Admin UI](https://docs.servicestack.net/admin-ui) at: :::{.text-4xl .text-center .text-indigo-800} /admin-ui/apikeys ::: ![](/img/pages/auth/apikeys/admin-ui-apikeys.png) This will let you view and manage all API Keys in your App, including the ability to revoke API Keys, extend their Expiration date, as well as manage any Scopes and Features assigned to API Keys. ### Customizing API Key UIs The `ApiKeysFeature` plugin can be configured to specify which **Scopes** and **Features** can be assigned to API Keys as well as the different Expiration Options you want available in the API Key management UIs, e.g: ```csharp services.AddPlugin(new ApiKeysFeature { // Optional: Available Features Admin Users can assign to any API Key Features = [ Features.Paid, Features.Tracking, ], // Optional: Available Scopes Admin Users can assign to any API Key Scopes = [ Scopes.TodoRead, Scopes.TodoWrite, ], // Optional: Limit available Expiry options that can be assigned to API Keys // ExpiresIn = [ // new("", "Never"), // new("7", "7 days"), // new("30", "30 days"), // new("365", "365 days"), // ], }); ``` ### Admin User API Keys When the `ApiKeysFeature` plugin is registered, the [User Admin UI](https://docs.servicestack.net/admin-ui-identity-users) will be enhanced to include the ability to create and manage API Keys for the user at the bottom of the **Edit User** form:
![](/img/pages/auth/apikeys/admin-ui-user-apikeys.png) #### Creating User API Keys When creating API Keys, you can assign them a **Name**, an **Expiration** date and any **Scopes**, **Features** and **Notes**. ![](/img/pages/auth/apikeys/admin-ui-user-apikeys-create.png) ### Restrict to APIs `Scopes` provide a simple way to logically group a collection of related APIs behind UX-friendly names without Users needing to know the behavior of each individual API. In addition, Users who want fine-grained control can also restrict API Keys to only be able to access specific APIs that their systems make use of by selecting them from the **Restrict to APIs** option: ![](/img/pages/auth/apikeys/apikeys-restrict-to.png) #### One Time only access of generated API Key All UIs limit access to the generated API Key token so that it's only accessible at the time of creation: ![](/img/pages/auth/apikeys/admin-ui-user-apikeys-create-dialog.png) #### Editing User API Keys Everything about the API Key can be edited after it's created except for the generated API Key token itself, in addition to being able to cancel and revoke the API Key: ![](/img/pages/auth/apikeys/admin-ui-user-apikeys-edit.png) Invalid API Keys that have expired or have been disabled will appear disabled in the UI: ![](/img/pages/auth/apikeys/admin-ui-user-apikeys-disabled.png) ## User Management API Keys In addition to the built-in Admin UIs to manage API Keys, all Identity Auth Tailwind templates have also been updated to include support for managing API Keys in their User Account pages: <div class="not-prose mt-8 grid grid-cols-2 gap-4"> <a class="block group border dark:border-gray-800 hover:border-indigo-700 dark:hover:border-indigo-700" href="https://blazor-vue.web-templates.io"> <div style="max-height:350px;overflow:hidden"> <img class="p-2" src="https://raw.githubusercontent.com/ServiceStack/Assets/master/csharp-templates/blazor-vue.png"></div> <div class="bg-gray-50 dark:bg-gray-800 text-gray-600
dark:text-gray-300 font-semibold group-hover:bg-indigo-700 group-hover:text-white text-center py-2">blazor-vue.web-templates.io</div> </a> <a class="block group border dark:border-gray-800 hover:border-indigo-700 dark:hover:border-indigo-700" href="https://razor.web-templates.io"> <div style="max-height:350px;overflow:hidden"> <img class="p-2" src="https://raw.githubusercontent.com/ServiceStack/Assets/master/csharp-templates/razor.png"></div> <div class="bg-gray-50 dark:bg-gray-800 text-gray-600 dark:text-gray-300 font-semibold group-hover:bg-indigo-700 group-hover:text-white text-center py-2">razor.web-templates.io</div> </a> <a class="block group border dark:border-gray-800 hover:border-indigo-700 dark:hover:border-indigo-700" href="https://mvc.web-templates.io"> <div style="max-height:350px;overflow:hidden"> <img class="p-2" src="https://raw.githubusercontent.com/ServiceStack/Assets/master/csharp-templates/mvc.png"></div> <div class="bg-gray-50 dark:bg-gray-800 text-gray-600 dark:text-gray-300 font-semibold group-hover:bg-indigo-700 group-hover:text-white text-center py-2">mvc.web-templates.io</div> </a> </div> The templates aren't configured to use API Keys by default, but new projects can be configured to use API Keys by selecting the **API Keys** feature on the [Start Page](/start): [![](/img/pages/auth/apikeys/start-apikeys.png)](/start) Or by mixing `apikeys` into your host project: :::sh npx add-in apikeys ::: Which adds the `Configure.ApiKeys.cs` modular startup to your Host project, registering the `ApiKeysFeature` plugin where you'd use the `UserScopes` and `UserFeatures` collections instead to control which scopes and features Users can assign to their own API Keys, e.g: ```csharp services.AddPlugin(new ApiKeysFeature { // Optional: Available Features Admin Users can assign to any API Key Features = [ Features.Paid, Features.Tracking, ], // Optional: Available Scopes Admin Users can assign to any API Key Scopes = [ Scopes.TodoRead,
Scopes.TodoWrite, ], // Optional: Limit available Scopes Users can assign to their own API Keys UserScopes = [ Scopes.TodoRead, ], // Optional: Limit available Features Users can assign to their own API Keys UserFeatures = [ Features.Tracking, ], }); ``` ### Identity Auth API Keys When enabled, users will be able to create and manage their own API Keys from their Identity UI pages, which will use any configured `UserScopes` and `UserFeatures`: ![](/img/pages/auth/apikeys/identity-auth-apikeys.png) ### Client Usage Like most API Key implementations, API Keys can be passed as an [HTTP Authorization Bearer Token](https://datatracker.ietf.org/doc/html/rfc6750#section-2.1) that can be configured in ServiceStack Service Clients with: #### C# ```csharp var client = new JsonApiClient(BaseUrl) { BearerToken = apiKey }; ``` #### TypeScript ```ts const client = new JsonServiceClient(BaseUrl) client.bearerToken = apiKey ``` ### API Key HTTP Header Alternatively, API Keys can also be passed in the `X-Api-Key` HTTP Header, which lets clients be configured with an alternative Bearer Token, allowing the same client to call both **Authenticated** and **API Key** protected APIs, e.g: #### C# ```csharp var client = new JsonApiClient(BaseUrl) { BearerToken = jwt, Headers = { [HttpHeaders.XApiKey] = apiKey } }; ``` #### TypeScript ```ts const client = new JsonServiceClient(BaseUrl) client.bearerToken = jwt client.headers.set('X-Api-Key', apiKey) ``` Or use a different HTTP Header by configuring `ApiKeysFeature.HttpHeader`, e.g: ```csharp services.AddPlugin(new ApiKeysFeature { HttpHeader = "X-Alt-Key" }); ``` --- # Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/app-tasks.md --- title: App Tasks --- App Tasks let you run one-off tasks with the full context of your App but without the overhead of maintaining a separate **.exe** with duplicated App configuration.
With App Tasks you can run your ASP .NET Core App, run the specified Tasks, then exit before launching its HTTP Server. ### Example The `AppTasks` static class can be used to register user-defined tasks, e.g: ```csharp var runner = new TaskRunner(app.Services); AppTasks.Register("task1", args => runner.Task1(args)); AppTasks.Register("task2", args => runner.Task2(args)); AppTasks.Run(); app.Run(); ``` Which can then be run from the command-line with: :::sh dotnet run --AppTasks=task1:arg1,arg2;task2:arg1,arg2 ::: Which will run the tasks in the specified order, before immediately exiting. If any of the tasks fail, the command will return the 1-based index of the task that failed, otherwise it will return a **0** success result. ## DB Migration App Task For a more complete example we'll look at how [code-first DB Migrations](/ormlite/db-migrations) uses App Tasks to run DB Migrations from the command-line. ### Running migrations from command-line To be able to run migrations from the command-line, DB Migrations needs access to your App's DB configuration. The best way to do this is to run your App normally, then access the configured `IDbConnectionFactory` from the IOC, perform the migrations, then exit with either a success or failure error code. To do this we've added support for **AppTasks** which let you define tasks in your App that you can run from the command-line. This lets you perform migrations in a separate stage to check migrations were successful before running your App.
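To make the `--AppTasks=task1:arg1,arg2;task2:arg1,arg2` argument format above concrete, here's a small hypothetical parser sketch. The function name and shape are illustrative assumptions, not ServiceStack's actual implementation:

```typescript
// Hypothetical parser for the --AppTasks=task1:arg1,arg2;task2 format:
// tasks are ';'-separated, each task is "name" or "name:comma,separated,args"
function parseAppTasks(value: string): { name: string, args: string[] }[] {
    return value.split(';').filter(t => t.length > 0).map(task => {
        const [name, argList] = task.split(':', 2)
        return { name, args: argList ? argList.split(',') : [] }
    })
}
```

A task without a `:` simply gets an empty argument list, matching the `dotnet run --AppTasks=migrate` usage shown later.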
### Configuring existing Projects You can add DB Migration support to existing projects by applying the [migrations](https://gist.github.com/gistlyn/50df00df4b3b9faa94a73d32ab4b2484) gist to your project with: :::sh npx add-in migrations ::: This will register the Migration **AppTasks** with your App via a [Modular Startup](/modular-startup) configuration: ```csharp public class ConfigureDbMigrations : IHostingStartup { public void Configure(IWebHostBuilder builder) => builder .ConfigureAppHost(afterAppHostInit:appHost => { var migrator = new Migrator(appHost.Resolve<IDbConnectionFactory>(), typeof(Migration1000).Assembly); AppTasks.Register("migrate", _ => migrator.Run()); AppTasks.Register("migrate.revert", args => migrator.Revert(args[0])); AppTasks.Run(); }); } ``` Here we can see we need to configure our `Migrator` with the `IDbConnectionFactory` we want to run against and the assembly where all our Migration classes are maintained. It also registers a **migrate** AppTask to migrate our App by comparing the **Migration** table with the Migrations in the specified Assembly to work out which Migrations are left to run (in order), and a **migrate.revert** AppTask to do the opposite and revert to the specified migration. This now lets you run your App in **"Task Mode"** where it will execute the specified task before promptly exiting with a **0** exit code if successful, or the 1-based index of the task that failed, e.g. **1** if the first Task failed.
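The Task-Mode exit-code convention (a **0** exit code on success, otherwise the 1-based index of the first failing task) can be sketched as follows. This is a simplified hypothetical runner to illustrate the convention, not ServiceStack's implementation:

```typescript
type AppTask = (args: string[]) => void

// Run tasks in order; return 0 on success or the
// 1-based index of the first task that threw
function runTasks(tasks: { run: AppTask, args: string[] }[]): number {
    for (let i = 0; i < tasks.length; i++) {
        try {
            tasks[i].run(tasks[i].args)
        } catch {
            return i + 1 // 1-based index of the failed task
        }
    }
    return 0 // all tasks succeeded
}
```

This makes the exit code directly usable in CI pipelines, where a non-zero code fails the build before the App is deployed.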
### dotnet Migration Tasks We can then execute our App Task by running our App with the `AppTasks` command-line argument of the Task we want to run, so we can run all pending migrations with: :::sh dotnet run --AppTasks=migrate ::: The format to revert a migration is: ```bash $ dotnet run --AppTasks=migrate.revert:<name> ``` Where **name** is either the class name of the Migration you want to revert to (inclusive) or you can use **last** to revert the last migration: :::sh dotnet run --AppTasks=migrate.revert:last ::: or **all** to revert all migrations: :::sh dotnet run --AppTasks=migrate.revert:all ::: ### npm Migration Scripts To make these easier to remember and use, the tasks are also added as npm scripts: ```json { "scripts": { "migrate": "dotnet run --AppTasks=migrate", "revert:last": "dotnet run --AppTasks=migrate.revert:last", "revert:all": "dotnet run --AppTasks=migrate.revert:all", "rerun:last": "npm run revert:last && npm run migrate" } } ``` Which can be run with: ```bash $ npm run migrate $ npm run revert:last $ npm run revert:all $ npm run rerun:last ``` Rider provides a nice UX for running these directly from the IDE, where it will print all executed SQL output in a dedicated Console: ![](/img/pages/ormlite/migration-scripts.png) ### ASP .NET Core Projects General (i.e.
non-ServiceStack) ASP.NET Core Apps can instead configure AppTasks before `app.Run()` in their **Program.cs**: ```csharp var migrator = new Migrator(app.Services.Resolve<IDbConnectionFactory>(), typeof(Migrations.Migration1000).Assembly); AppTasks.Register("migrate", _ => migrator.Run()); AppTasks.Register("migrate.revert", args => migrator.Revert(args[0])); AppTasks.Run(); app.Run(); ``` --- # Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/appsettings.md --- slug: appsettings title: Configuration & AppSettings --- Instead of building verbose nested XML configSection classes, our preference is to store structured configuration in Web.config's `<appSetting/>`, which can still express rich config graphs but in a much more human-friendly and manageable way. ServiceStack's pluggable `IAppSettings` API is a cleaner alternative for storing your Application's structured configuration, providing a high-level API to read your Web.config's `<appSetting/>` values into a `List`, `Dictionary` or your own clean Custom POCO Types using the human-friendly [JSV format](/jsv-format). ```csharp public interface IAppSettings { Dictionary<string, string> GetAll(); List<string> GetAllKeys(); bool Exists(string key); void Set<T>(string key, T value); string GetString(string name); IList<string> GetList(string key); IDictionary<string, string> GetDictionary(string key); T Get<T>(string name); T Get<T>(string name, T defaultValue); } ``` Benefits over the existing Configuration API include the ability to store rich data structures in appSettings values, more succinct access to typed data, and since it's an interface it's decoupled from .NET Configuration classes and can easily be swapped to source your configuration from different sources without a rewrite, e.g. from a text file or central DB.
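The value of coding against a small settings interface (typed reads with fallback defaults over a plain string map) can be sketched with a minimal stand-in provider. This is a hypothetical TypeScript sketch of the idea, not a ServiceStack API:

```typescript
// Minimal settings-provider sketch: typed reads with fallback defaults.
// A different backing source (file, DB, env vars) only needs to supply the map.
class SimpleSettings {
    constructor(private map: Record<string, string>) {}
    exists(key: string): boolean { return key in this.map }
    getString(key: string): string | undefined { return this.map[key] }
    getList(key: string): string[] {
        return (this.map[key] ?? '').split(',').filter(s => s.length > 0)
    }
    // Typed read: parse the raw string if present, else return the default
    get<T>(key: string, defaultValue: T, parse: (s: string) => T): T {
        return key in this.map ? parse(this.map[key]) : defaultValue
    }
}
```

Because callers only depend on this small surface, swapping the backing store never requires a rewrite, which is the same property `IAppSettings` provides.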
### Example Usage ```xml <appSettings> <add key="LastUpdated" value="01/01/2012 12:00:00" /> <add key="AllowedUsers" value="Tom,Mick,Harry" /> <add key="RedisConfig" value="{Host:localhost,Port:6379,Database:1,Timeout:10000}" /> </appSettings> ``` Reading the above configuration in code: ```csharp IAppSettings appSettings = new AppSettings(); DateTime lastUpdate = appSettings.Get<DateTime>("LastUpdated"); IList<string> allowedUsers = appSettings.GetList("AllowedUsers"); RedisConfig redisConf = appSettings.Get<RedisConfig>("RedisConfig"); //use default value if no config exists var searchUrl = appSettings.Get("SearchUrl", "http://www.google.com"); ``` The default value in the last call provides a convenient way to maintain workable default options in code (allowing re-use in Unit/Integration tests) whilst still being overridable in the **Web.config** when you need to. ## Multi AppSettings The `MultiAppSettings` AppSettings provider enables reading configuration from multiple configuration sources. ### Example Usage The example below creates a cascading configuration that first checks Environment variables, then looks in a local `~/appsettings.txt` plain-text file before falling back to `Web.config`: ```csharp AppSettings = new MultiAppSettings( new EnvironmentVariableSettings(), new TextFileSettings("~/appsettings.txt".MapHostAbsolutePath()), new AppSettings()); ``` ### Multi AppSettings Builder An alternative is to use `MultiAppSettingsBuilder` if you prefer a fluent, discoverable API: ```csharp AppSettings = new MultiAppSettingsBuilder() .AddAppSettings() .AddDictionarySettings(new Dictionary<string,string> { ["override"] = "setting" }) .AddEnvironmentalVariables() .AddTextFile("~/path/to/settings.txt".MapProjectPath()) .Build(); ``` ## OrmLite AppSettings `OrmLiteAppSettings` provides an alternative read/write API that lets you maintain your application's configuration in any [RDBMS back-end OrmLite supports](/ormlite/).
It works like a mini Key/Value database in which you can store any serializable value against any key, maintained in the simple Id/Value `ConfigSettings` table. ### Usage Registration just uses an OrmLite DB Factory, e.g: ```csharp container.Register(c => new OrmLiteAppSettings(c.Resolve<IDbConnectionFactory>())); //Create the ConfigSettings table if it doesn't exist container.Resolve<OrmLiteAppSettings>().InitSchema(); ``` It can then be accessed like any other [AppSettings API](https://github.com/ServiceStack/ServiceStack/blob/master/tests/ServiceStack.Common.Tests/Configuration/AppSettingsTests.cs). The example below reads the `MyConfig` POCO stored at `config`, otherwise uses the default value if it doesn't exist: ```csharp var config = appSettings.Get("config", new MyConfig { Key = "DefaultValue" }); ``` In addition to the AppSettings read-only APIs, it also supports writing config values, e.g: ```csharp var latestStats = appSettings.GetOrCreate("stats", () => statsProvider.GetLatest()); ``` ## EnvironmentVariableSettings The `EnvironmentVariableSettings` AppSettings provider sources configuration from Environment variables: ```csharp var appSettings = new EnvironmentVariableSettings(); ``` ## TextFileSettings `TextFileSettings` lets you read your Application's configuration from a plain-text file, which can easily be overridden with custom environment settings as part of the CI deployment process, providing a nice alternative to custom Web.config configurations. ### Example Usage To use it, just provide the path to the plain-text file that contains the app-settings: ```csharp var appSettings = new TextFileSettings("~/app.settings".MapHostAbsolutePath()); ``` ### TextFile Format Each appSetting is on a new line with the **Key** and **Value** separated by a space: ``` {Key} {Value}\n ``` > The delimiter can be changed in the constructor e.g.
`new TextFileSettings(path,delimiter:": ");` ### Extract key / value settings from text file Under the hood TextFileSettings uses the ParseKeyValueText extension method to extract key / value data from a string, e.g: ```csharp var configText = @" # comments starting with '#' and blank lines are ignored StringKey string value IntKey 42 ListKey A,B,C,D,E DictionaryKey A:1,B:2,C:3,D:4,E:5 PocoKey {Foo:Bar,Key:Value}"; Dictionary<string, string> configMap = configText.ParseKeyValueText(delimiter:" "); ``` ## DictionarySettings When combined with the existing `DictionarySettings`, this enables a rich, simple and clean alternative to .NET's App.config config section for reading structured configuration into clean data structures, e.g: ```csharp IAppSettings settings = new DictionarySettings(configMap); string value = settings.GetString("StringKey"); int intValue = settings.Get("IntKey", defaultValue:1); List<string> values = settings.GetList("ListKey"); Dictionary<string,string> valuesMap = settings.GetDictionary("DictionaryKey"); MyConfig config = settings.Get("PocoKey", new MyConfig { Key = "default"}); ``` ### SimpleAppSettings `SimpleAppSettings` is an alternative Dictionary-based provider that only requires a dependency on `ServiceStack.Common`, e.g: ```csharp AppSettings = new SimpleAppSettings(new Dictionary<string, string> { ["string"] = "value", ["EnableFeature.1"] = "true", ["AllowedUsers"] = "Tom,Mick,Harry", }); string value = AppSettings.GetString("string"); bool enableFeature1 = AppSettings.Get("EnableFeature.1", defaultValue:false); bool enableFeature2 = AppSettings.Get("EnableFeature.2", defaultValue:false); IList<string> allowedUsers = AppSettings.GetList("AllowedUsers"); ``` ## [DynamoDbAppSettings](https://github.com/ServiceStack/ServiceStack/blob/master/docs/2015/release-notes.md#dynamodbappsettings) Storing production config in DynamoDB reduces the effort for maintaining production settings decoupled from source code.
Here `DynamoDbAppSettings` is registered first in a `MultiAppSettings` collection, so it checks entries in the DynamoDB `ConfigSetting` Table first before falling back to local **Web.config** appSettings: ```csharp #if !DEBUG AppSettings = new MultiAppSettings( new DynamoDbAppSettings(new PocoDynamo(awsDb), initSchema:true), new AppSettings()); // fallback to Web.config #endif ``` ## First class AppSettings After proving its value over the years we've decided to make it a first-class property on `IAppHost.AppSettings` which defaults to looking at .NET's App/Web.config appSettings. The new [Chat.zip](https://github.com/ServiceStack/ServiceStack.Gap/raw/master/deploy/Chat.zip) App explores different ways AppSettings can be used: If there's an existing `appsettings.txt` file where the **.exe** is run, it will use that, otherwise it falls back to **Web.config** appSettings: ```csharp public AppHost() : base("Chat", typeof(ServerEventsServices).Assembly) { var customSettings = new FileInfo("appsettings.txt"); AppSettings = customSettings.Exists ? (IAppSettings)new TextFileSettings(customSettings.FullName) : new AppSettings(); } ``` As a normal property in your AppHost, AppSettings can be accessed directly in `AppHost.Configure()`: ```csharp public void Configure(Container container) { ... var redisHost = AppSettings.GetString("RedisHost"); if (redisHost != null) { container.Register<IServerEvents>(c => new RedisServerEvents(new PooledRedisClientManager(redisHost))); container.Resolve<IServerEvents>().Start(); } } ``` Inside your services or IOC dependencies, like any other auto-wired dependency: ```csharp public class ServerEventsServices : Service { public IAppSettings AppSettings { get; set; } public void Any(PostRawToChannel request) { if (!IsAuthenticated && AppSettings.Get("LimitRemoteControlToAuthenticatedUsers", false)) throw new HttpError(HttpStatusCode.Forbidden, "You must be authenticated to use remote control."); ...
} } ``` Directly within Razor views: ```html <style> body { background-image: url(@AppSettings.Get("background","/img/bg.jpg")) } </style> ``` As well as outside ServiceStack, via the `HostContext` static class: ```csharp var redisHost = HostContext.AppSettings.GetString("redis"); ``` ## AppSettings are Writable A new `Set()` API was added to [IAppSettings](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Configuration/IAppSettings.cs) letting you save any serializable property that works for all providers: ```csharp public interface IAppSettings { void Set<T>(string key, T value); ... } AppSettings.Set("Poco", new MyConfig { Foo = "Baz" }); ``` In providers that support writable configuration natively like `OrmLiteAppSettings` and `DictionarySettings`, the settings get written through to the underlying provider. For read-only providers like Web.config's `AppSettings` or `TextFileSettings`, a **shadowed** cache is kept that works similarly to prototypal shadowing in JavaScript: if a property doesn't exist, setting it will store it on the top-level object instance, which also takes precedence on subsequent property access. ## IConfiguration To create AppSettings from an `IConfiguration` object: ```csharp AppSettings = new NetCoreAppSettings(configuration); ``` ## Community AppSettings ### [ServiceStack.Configuration.Consul](https://github.com/MacLeanElectrical/servicestack-configuration-consul) An implementation of IAppSettings that uses the Consul.io key/value store as backing storage. --- # Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/architecture-overview.md --- title: Architecture Overview --- Ultimately behind-the-scenes ServiceStack is just built on top of ASP.NET's Raw [IHttpAsyncHandler](https://msdn.microsoft.com/en-us/library/ms227433.aspx).
Existing abstractions and [xmlconfig-encumbered legacy ASP.NET providers](http://mono.servicestack.net/mvc-powerpack/) have been abandoned, in favour of fresh, simple and clean [Caching](/caching), [Session](/auth/sessions) and [Authentication](/auth/authentication-and-authorization) providers all based on clean POCOs, supporting multiple back-ends and all working seamlessly together. Our best-practices architecture is purposely kept simple and introduces minimal new concepts or artificial constructs, all of which can be eloquently captured in the diagram below: ## Server Architecture ![ServiceStack Logical Architecture View](/img/pages/overview/servicestack-logical-view-02.png) ## Client Architecture ServiceStack's [Message-based design](/advantages-of-message-based-web-services) allows us to easily support [typed, generic and re-usable Service Clients](/clients-overview) for all our popular formats: ![ServiceStack HTTP Client Architecture](/img/pages/overview/servicestack-httpclients.png) Having all clients share the same interface allows them to be hot-swappable at run-time without code changes and keeps them highly testable, where the same unit test can also [serve as an XML, JSON, JSV, SOAP Integration Test](https://github.com/ServiceStack/ServiceStack/blob/master/tests/ServiceStack.WebHost.IntegrationTests/Tests/WebServicesTests.cs). By promoting clean (endpoint-ignorant and dependency-free) Service and DTO classes, your web services are instantly re-usable and can be hosted in non-http contexts as well. E.g. the client architecture when one of the [built-in MQ Hosts is enabled](/redis-mq): ![ServiceStack MQ Client Architecture](/img/pages/overview/servicestack-mqclients.png) ## Implementation The entry point for all ASP.NET and HttpListener requests is in the [ServiceStack.HttpHandlerFactory](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack/HttpHandlerFactory.cs) whose purpose is to return the appropriate IHttpHandler for the incoming request.
There are 2 distinct modes in any ServiceStack application: 1. AppHost Setup and Configuration - Only done once for all services. Run only once on App StartUp. 1. Runtime - Run on every request: uses dependencies, plugins, etc. defined in the AppHost. Each new request re-binds all IOC dependencies to a new service instance which gets disposed at the end of each request. The implementation of this can be visualized below: ![ServiceStack Overview](/img/pages/overview/servicestack-overview-01.png) After the `IHttpHandler` is returned, it gets executed with the current ASP.NET or HttpListener request wrapped in a common [IRequest](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Web/IRequest.cs) instance. --- # Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/ormlite/async-apis.md --- title: OrmLite Async API --- A quick overview of the Async APIs can be seen in the class diagram below: ![OrmLite Async APIs](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/ormlite/OrmLiteApiAsync.png) Essentially most of OrmLite's public APIs now have async equivalents of the same name with an additional conventional `*Async` suffix.
The Async APIs also take an optional `CancellationToken`, making converting sync code trivial, where you just need to add the `Async` suffix and **await** keyword, as can be seen in the [Customer Orders UseCase upgrade to Async diff](https://github.com/ServiceStack/ServiceStack.OrmLite/commit/c1ce6f0eac99133fc232b263c26c42379d4c5f48), e.g: Sync: ```csharp db.Insert(new Employee { Id = 1, Name = "Employee 1" }); db.Save(product1, product2); var customer = db.Single<Customer>(new { Email = email }); ``` Async: ```csharp await db.InsertAsync(new Employee { Id = 1, Name = "Employee 1" }); await db.SaveAsync(product1, product2); var customer = await db.SingleAsync<Customer>(new { Email = email }); ``` ::: info Effectively the only Data Access APIs that don't have async equivalents are the `*Lazy` APIs yielding a lazy sequence (incompatible with async) as well as the **Schema** DDL APIs which are typically not used at runtime. ::: For a quick preview of many of the new Async APIs in action, check out [ApiSqlServerTestsAsync.cs](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack.OrmLite/tests/ServiceStack.OrmLite.Tests/Async/ApiSqlServerTestsAsync.cs). ## Async RDBMS Providers Currently, only a limited number of RDBMS providers offer async APIs, which at this time are: - [SQL Server .NET 4.7.2+](https://www.nuget.org/packages/ServiceStack.OrmLite.SqlServer) - [PostgreSQL .NET 4.7.2+](https://www.nuget.org/packages/ServiceStack.OrmLite.PostgreSQL) - [MySQL .NET 4.7.2+](https://www.nuget.org/packages/ServiceStack.OrmLite.MySql) We've also added a [.NET 4.7.2 build for Sqlite](https://www.nuget.org/packages/ServiceStack.OrmLite.Sqlite) as it's a common use-case to swap out to Sqlite's in-memory provider for faster tests. But as Sqlite doesn't provide async APIs under-the-hood, we fall back to *pseudo async* support where we just wrap its synchronous responses in `Task` results.
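The *pseudo async* fallback described above amounts to wrapping a synchronous result in an already-completed Task so the async API shape is preserved. A minimal hypothetical sketch of the idea (illustrative, not the actual provider code):

```typescript
// Stand-in for a blocking driver call that returns results synchronously
function selectSync<T>(rows: T[]): T[] {
    return rows
}

// Pseudo-async wrapper: no real async I/O underneath, it just
// resolves immediately with the synchronous result, preserving
// the same async signature as providers with true async support
function selectAsync<T>(rows: T[]): Promise<T[]> {
    return Promise.resolve(selectSync(rows))
}
```

Callers can `await selectAsync(...)` exactly as they would a true async provider; the trade-off is that the call still blocks the calling thread for the duration of the underlying synchronous work.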
--- # Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/ormlite/async-tasks-builder.md --- title: RDBMS Async Tasks Builder --- ### Sequential Async DB Access Async improves I/O thread utilization in multi-threaded apps like Web Servers. However, it doesn't improve the performance of individual API Requests that need to execute multiple independent DB Requests. These are often written to run async db access sequentially like this: ```csharp var rockstars = await Db.SelectAsync<Rockstar>(); var albums = await Db.SelectAsync<Album>(); var departments = await Db.SelectAsync<Department>(); var employees = await Db.SelectAsync<Employee>(); ``` The issue is that they're not run in parallel: each DB Request executes sequentially, with the Request for Albums not starting until the Request for Rockstars has completed. To run them in parallel you would need to open multiple scoped DB Connections, await them concurrently, then do the syntax boilerplate gymnastics required to extract the generic typed results, e.g: ```csharp var connections = await Task.WhenAll( DbFactory.OpenDbConnectionAsync(), DbFactory.OpenDbConnectionAsync(), DbFactory.OpenDbConnectionAsync(), DbFactory.OpenDbConnectionAsync() ); using var dbRockstars = connections[0]; using var dbAlbums = connections[1]; using var dbDepartments = connections[2]; using var dbEmployees = connections[3]; var tasks = new List<Task> { dbRockstars.SelectAsync<Rockstar>(), dbAlbums.SelectAsync<Album>(), dbDepartments.SelectAsync<Department>(), dbEmployees.SelectAsync<Employee>() }; await Task.WhenAll(tasks); var rockstars = ((Task<List<Rockstar>>)tasks[0]).Result; var albums = ((Task<List<Album>>)tasks[1]).Result; var departments = ((Task<List<Department>>)tasks[2]).Result; var employees = ((Task<List<Employee>>)tasks[3]).Result; ``` Even without error handling, writing code like this quickly becomes tedious, less readable and error-prone, and as a result
is rarely done. ### Parallel DB Requests in TypeScript This is easier to achieve in languages like TypeScript where typed ORMs like [litdb.dev](https://litdb.dev) can run multiple DB Requests in parallel with just: ```ts const [rockstars, albums, departments, employees] = await Promise.all([ db.all<Rockstar>($.from(Rockstar)), //= Rockstar[] db.all<Album>($.from(Album)), //= Album[] db.all<Department>($.from(Department)), //= Department[] db.all<Employee>($.from(Employee)), //= Employee[] ]) ``` Which benefits from TypeScript's powerful type system that allows destructuring arrays whilst preserving their positional types, whilst its single-threaded event loop lets you reuse the same DB Connection to run DB Requests in parallel without multi-threading issues. ## OrmLite's Async Tasks Builder OrmLite's `AsyncDbTasksBuilder` provides a similar benefit of making it effortless to run multiple async DB Requests in parallel, which looks like: ```csharp var results = await DbFactory.AsyncDbTasksBuilder() .Add(db => db.SelectAsync<Rockstar>()) .Add(db => db.SelectAsync<Album>()) .Add(db => db.SelectAsync<Department>()) .Add(db => db.SelectAsync<Employee>()) .RunAsync(); var (rockstars, albums, departments, employees) = results; ``` Which, just like TypeScript's destructuring, returns a positionally typed tuple of the results which can be destructured back into their typed variables, e.g: ```csharp (List<Rockstar> rockstars, List<Album> albums, List<Department> departments, List<Employee> employees) = results; ``` ### Supports up to 8 Tasks It allows chaining up to **8 async Tasks in parallel** as C#'s Type System doesn't allow for preserving different positional generic types in an unbounded collection. Instead, each added Task returns a new Generic Type builder which preserves the positional types before it.
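The "each added Task returns a new builder that preserves the positional types before it" idea can be sketched in TypeScript, whose variadic tuple types make the accumulation explicit. This is an illustrative sketch of the pattern, not OrmLite's `AsyncDbTasksBuilder` itself:

```typescript
// Each add<R>() returns a NEW builder whose tuple type appends R,
// so runAsync() yields a positionally typed tuple of all results
class TasksBuilder<T extends unknown[] = []> {
    constructor(private tasks: (() => Promise<unknown>)[] = []) {}

    add<R>(task: () => Promise<R>): TasksBuilder<[...T, R]> {
        return new TasksBuilder<[...T, R]>([...this.tasks, task])
    }

    runAsync(): Promise<T> {
        // start all tasks and await them together
        return Promise.all(this.tasks.map(run => run())) as Promise<T>
    }
}
```

Note C# lacks variadic generics, which is why `AsyncDbTasksBuilder` instead defines a separate generic builder type per arity, capping the chain at 8 Tasks.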
### Supports both Async `Task<T>` and `Task` APIs Where `Task<T>` and `Task` APIs can be mixed and matched interchangeably: ```csharp var builder = DbFactory.AsyncDbTasksBuilder() .Add(db => db.InsertAsync(rockstars[0],rockstars[1])) .Add(db => db.SelectAsync<Rockstar>()) .Add(db => db.InsertAsync(albums[2],albums[3])) .Add(db => db.SelectAsync<Album>()) .Add(db => db.InsertAsync([department])) .Add(db => db.SelectAsync<Department>()) .Add(db => db.InsertAsync([employee])) .Add(db => db.SelectAsync<Employee>()); ``` Where to preserve the results chain, `Task` APIs return `bool` results, e.g: ```csharp (bool r1, List<Rockstar> r2, bool r3, List<Album> r4, bool r5, List<Department> r6, bool r7, List<Employee> r8) = await builder.RunAsync(); ``` ### Error Handling Whilst tasks are executed in parallel when added, any Exceptions are only thrown when the task is awaited: ```csharp using var Db = await OpenDbConnectionAsync(); var builder = DbFactory.AsyncDbTasksBuilder() .Add(db => db.InsertAsync(rockstars[0])) .Add(db => db.InsertAsync(rockstars[0])); // <-- Duplicate PK Exception // Exceptions are not thrown until the task is awaited var task = builder.RunAsync(); try { await task; } catch (Exception e) { // handle the Duplicate PK Exception here throw; } ``` --- # Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/redis/async.md --- slug: async title: Redis Async APIs --- All Redis Client Managers implement both `IRedisClientsManager` and `IRedisClientsManagerAsync` so IOC registrations remain the same and can continue to register against the existing `IRedisClientsManager` interface, e.g: ```csharp container.Register<IRedisClientsManager>(c => new RedisManagerPool(redisConnectionString)); ``` Where it can be used to resolve both sync `IRedisClient` and async `IRedisClientAsync` clients, e.g: ```csharp using var syncRedis = container.Resolve<IRedisClientsManager>().GetClient(); await using var asyncRedis = await
container.Resolve<IRedisClientsManager>().GetClientAsync(); ``` If you want to force async-only API usage, you could choose to just register `IRedisClientsManagerAsync`, which only lets you resolve async-only `IRedisClientAsync` and `ICacheClientAsync` clients, e.g: ```csharp public void ConfigureServices(IServiceCollection services) { services.AddSingleton<IRedisClientsManagerAsync>(c => new RedisManagerPool()); } //... public class MyDep(IRedisClientsManagerAsync manager) { public async Task<long> Incr(string key, uint value) { await using var redis = await manager.GetClientAsync(); return await redis.IncrementAsync(key, value); } } ``` ## Usage in ServiceStack Inside ServiceStack Services & Controllers we recommend using `GetRedisAsync()` to resolve an `IRedisClientAsync`: ```csharp public class MyService : Service { public async Task Any(MyRequest request) { await using var redis = await GetRedisAsync(); await redis.IncrementAsync(nameof(MyRequest), 1); } } public class HomeController : ServiceStackController { public async Task<ActionResult> Index() { await using var redis = await GetRedisAsync(); await redis.IncrementAsync(nameof(HomeController), 1); return View(); } } ``` --- # Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/autoquery/audit-log.md --- title: AutoQuery CRUD Executable Audit Log --- In addition to being able to declaratively develop [AutoQuery](/autoquery/) and [CRUD](/autoquery/crud) APIs without needing to implement them, you can also enable a **recorded history of Executable Audit information** over all AutoCrud operations. In addition to maintaining an automated recorded history of every change to an entity, this executable audit log also exhibits "EventSourcing-like capabilities" in being able to recreate the entity's state using the latest Services implementation by replaying all AutoCrud operations in order. This can be applied at a granular entity or table level, or in the
unlikely case that all System DB writes are performed through AutoQuery CRUD Services, it's even capable of re-creating the entire DB state from just its Audit history, although this depends on whether all changes made to AutoCrud Services are backwards compatible. Being able to rebuild your System's DB by replaying audit history events is a nice property that can serve as an integrity check to verify that all changes leading up to the current DB state have been recorded. As data is the most important part of most systems, it can be beneficial to maintain a change history of when items were created, modified and deleted (and by whom), as we're used to when using a VCS for our source code. Typically this means also employing "non-destructive" approaches to system design like "Soft Deletes" which you can declaratively implement with Auto CRUD. ## Executable Crud Audit Events This feature tries to obtain some of the nice features of Event Sourcing without the additional complexity by allowing you to capture all CRUD operations in an executable log whilst still retaining your RDBMS as the master authority. This feature doesn't require any additional dev overhead as your AutoCrud Request DTOs are the recorded events.
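The "recorded request DTOs are the events" idea can be illustrated with a tiny replay sketch: applying Create/Update/Delete events in order reconstructs the current state of a table. The event shape and `replay` function here are hypothetical simplifications, not ServiceStack's `CrudEvent`/`CrudEventsExecutor`:

```typescript
// Simplified audit event: the recorded request body is the event payload
type CrudEvent = {
    eventType: 'Create' | 'Update' | 'Delete'
    modelId: number
    body?: Record<string, unknown>
}

// Replaying events in order rebuilds the table's current state
function replay(events: CrudEvent[]): Map<number, Record<string, unknown>> {
    const table = new Map<number, Record<string, unknown>>()
    for (const e of events) {
        if (e.eventType === 'Create') table.set(e.modelId, { ...e.body })
        else if (e.eventType === 'Update') table.set(e.modelId, { ...table.get(e.modelId), ...e.body })
        else table.delete(e.modelId)
    }
    return table
}
```

This is also why backwards compatibility matters: if a later Service change reinterprets an old event's body, replaying history no longer reproduces the same state.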
To enable this feature you just need to register an [ICrudEvents](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Server/CrudEvents.cs) provider which will let you persist your events in any data store, but typically you'd use `OrmLiteCrudEvents` to persist them in the same RDBMS that the AutoCrud requests are already writing to, e.g: ```csharp public class ConfigureAutoQuery : IHostingStartup { public void Configure(IWebHostBuilder builder) => builder .ConfigureServices(services => { // Enable Audit History services.AddSingleton<ICrudEvents>(c => new OrmLiteCrudEvents(c.GetRequiredService<IDbConnectionFactory>()) { // NamedConnections = { SystemDatabases.Reporting } }); services.AddPlugin(new AutoQueryFeature { MaxLimit = 1000, //IncludeTotal = true, }); }) .ConfigureAppHost(appHost => { appHost.Resolve<ICrudEvents>().InitSchema(); }); } ``` If you're using Multitenancy features or multiple RDBMSs in your AutoCrud DTOs, you can add them to `NamedConnections` where it will create a `CrudEvent` table in each of the RDBMSs used. That's all that's required: now every AutoCrud operation will persist the Request DTO and associated metadata in the Event entry below within a DB transaction: ```csharp public class CrudEvent : IMeta { [AutoIncrement] public long Id { get; set; } // AutoCrudOperation, e.g. Create, Update, Patch, Delete, Save public string EventType { get; set; } public string Model { get; set; } // DB Model Name public string ModelId { get; set; } // Primary Key of DB Model public DateTime EventDate { get; set; } // UTC public long?
        RowsUpdated { get; set; }            // How many rows were affected
    public string RequestType { get; set; }  // Request DTO Type
    public string RequestBody { get; set; }  // Serialized Request Body
    public string UserAuthId { get; set; }   // UserAuthId if Authenticated
    public string UserAuthName { get; set; } // UserName or unique User Identity
    public string RemoteIp { get; set; }     // Remote IP of the Request
    public string Urn { get; set; }          // URN format: urn:{requestType}:{ModelId}
    // Custom Reference Data, with or without a non-integer Primary Key
    public int? RefId { get; set; }
    public string RefIdStr { get; set; }
    public Dictionary<string, string> Meta { get; set; }
}
```

## Full Executable Audit History

With what's captured, this serves as an Audit History of state changes for any row, which you can query via the `Model` & `ModelId` columns, e.g:

```csharp
var dbEvents = (OrmLiteCrudEvents)container.Resolve<ICrudEvents>();
var rowAuditEvents = dbEvents.GetEvents(Db, nameof(Rockstar), id);
```

The contents of the Request DTO are stored as JSON in `RequestBody`. You can quickly display the contents of any JSON in human-friendly HTML with the [htmlDump](https://sharpscript.net/docs/html-scripts#htmldump) script if you're using `#Script`, `@Html.HtmlDump(obj)` if you're using Razor, or just the static `ViewUtils.HtmlDump(obj)` method to get a raw pretty-formatted HTML string.

## Replay AutoCrud Requests

If your entire database was created with AutoCrud Services you could delete its rows and re-create it by just re-playing all your AutoCrud DTOs in the order they were executed, which can be done with:

```csharp
var eventsPlayer = new CrudEventsExecutor(appHost);
foreach (var crudEvent in dbEvents.GetEvents(db))
{
    await eventsPlayer.ExecuteAsync(crudEvent);
}
```

The `CrudEventsExecutor` uses your AppHost's `ServiceController` to execute the message, e.g. the
same execution pipeline that MQ Requests use, so it will execute your AppHost's `GlobalMessageRequestFilters/Async` if you have any custom logic in Request Filters (e.g. the Multi TenantId example above).

It also executes authenticated AutoCrud requests as the original Authenticated User of the AutoCrud Request, which, just like [JWT Refresh Tokens](/auth/jwt-authprovider#requires-user-auth-repository-or-iusersessionsource), requires either using an Auth Repository or, if you're using a Custom Auth Provider, implementing an `IUserSessionSource` to load User Sessions from a custom data store.

When replaying the Audit Events it will use the original primary key, even if you're using `[AutoIncrement]` Primary Keys. This lets you re-create the state of a single entry, e.g:

```csharp
db.DeleteById<Rockstar>(id);

var rowAuditEvents = dbEvents.GetEvents(Db, nameof(Rockstar), id);
foreach (var crudEvent in rowAuditEvents)
{
    await eventsPlayer.ExecuteAsync(crudEvent);
}
```

This is also useful if, for instance, you want the events to execute through your latest logic with any enhancements or bug fixes, etc.

## Ignoring Crud Events

You can selectively choose to ignore capturing events by returning `null` in the `EventsFilter` when registering `OrmLiteCrudEvents`, e.g:

```csharp
new OrmLiteCrudEvents(c.Resolve<IDbConnectionFactory>()) {
    EventsFilter = (row, context) => MyShouldIgnore(context) ?
        null : row
}
```

The `CrudContext` contains all the relevant information about the AutoQuery Crud request, including:

```csharp
public class CrudContext
{
    public IRequest Request { get; private set; }
    public IDbConnection Db { get; private set; }
    public ICrudEvents Events { get; private set; }
    public string Operation { get; set; }
    public object Dto { get; private set; }
    public Type ModelType { get; private set; }
    public Type RequestType { get; private set; }
    public Type ResponseType { get; private set; }
    public ModelDefinition ModelDef { get; private set; }
    public PropertyAccessor IdProp { get; private set; }
    public PropertyAccessor ResultProp { get; private set; }
    public PropertyAccessor CountProp { get; private set; }
    public PropertyAccessor RowVersionProp { get; private set; }
    public object Id { get; set; }
    public object Response { get; set; }
    public long? RowsUpdated { get; set; }
}
```

Alternatively you can ignore recording the event for requests tagged with `IRequest.Items[Keywords.IgnoreEvent]`, e.g:

```csharp
GlobalRequestFilters.Add((req, res, dto) => {
    if (MyShouldIgnore(dto))
        req.Items[Keywords.IgnoreEvent] = bool.TrueString;
});
```

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/locode/auditing.md

---
title: Auditing
---

A benefit of AutoQuery's structured, declarative approach to its CRUD APIs is that it's better able to enable high-level generic functionality that can benefit all CRUD APIs. AutoQuery CRUD's [Executable Audit Log](/autoquery/audit-log) is an example of this, which makes use of [AutoQuery CRUD Attributes](/autoquery/crud#autoquery-crud-attributes) to capture every CRUD operation responsible for any modifications to its underlying RDBMS tables.
We'll explore an overview of this feature by applying it to our simple Bookings table from the [AutoQuery CRUD Bookings Demo](/autoquery/crud-bookings) included in all [jamstacks.net](https://jamstacks.net/) project templates, thereby adding the ability to track all CRUD API operations and with them all modifications made to our `Booking` RDBMS table.

## Enabling Crud Events

The first thing we need to do is register the `ICrudEvents` dependency in our App's IOC, in this case using `OrmLiteCrudEvents` to store all Audit Information in the `CrudEvent` table of our configured database:

```csharp
public class ConfigureAutoQuery : IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureServices(services => {
            // Enable Audit History
            services.AddSingleton<ICrudEvents>(c =>
                new OrmLiteCrudEvents(c.Resolve<IDbConnectionFactory>()));

            services.AddPlugin(new AutoQueryFeature {
                MaxLimit = 1000,
                //IncludeTotal = true,
            });
        })
        .ConfigureAppHost(appHost => {
            // Create CrudEvent table if it doesn't exist
            appHost.Resolve<ICrudEvents>().InitSchema();
        });
}
```

As `ICrudEvents` stores all events in a separate table, we also need to use `InitSchema` above to create the `CrudEvent` table if it doesn't already exist.

## Enabling Audit History Tracking on Data Models and APIs

With `ICrudEvents` registered, we can now choose which Data Models we want to enable Audit Tracking on by having them inherit from the built-in `AuditBase` class:

```csharp
public class Booking : AuditBase
{
    [AutoIncrement]
    public int Id { get; set; }
    public string Name { get; set; }
    public RoomType RoomType { get; set; }
    public int RoomNumber { get; set; }
    public DateTime BookingStartDate { get; set; }
    public DateTime? BookingEndDate { get; set; }
    public decimal Cost { get; set; }
    public string Notes { get; set; }
    public bool?
        Cancelled { get; set; }
}
```

This will extend our RDBMS tables with additional Audit Info capturing who and when bookings were **Created**, **Last Modified** and **Deleted** (when using Soft Deletes):

```csharp
public abstract class AuditBase
{
    public DateTime CreatedDate { get; set; }
    public string CreatedBy { get; set; }
    public DateTime ModifiedDate { get; set; }
    public string ModifiedBy { get; set; }
    public DateTime? DeletedDate { get; set; }
    public string DeletedBy { get; set; }
}
```

Even without needing to inspect the audit history table, capturing this info on its own provides valuable insight into the provenance of each booking.

### Configuring CRUD APIs

But to be able to maintain a complete executable audit log we need to capture every CRUD API modification done on our `Booking` table, which we can do by annotating our Booking CRUD APIs with the [AutoApply](/autoquery/crud#apply-generic-crud-behaviors) attribute, declaring the behavior we want to apply to them.

For the Audit feature, this behavior is implemented in the pre-registered [AuditAutoCrudMetadataFilter](/autoquery/crud#auditautocrudmetadatafilter), which dynamically adds attributes to annotated APIs to achieve its desired behavior: primarily the [[AutoPopulate]](/autoquery/crud#autopopulate) attribute to populate the Audit Info and the [[AutoFilter]](/autoquery/crud#autofilter) attribute to ensure our Query APIs don't return any **Soft Deleted** records. Essentially `[AutoApply]` enables us to define a single attribute as a substitute for the multiple compound attributes needed to populate the Audit Info and filter queries.

The appropriate `[AutoApply]` behavior needs to be added to all the CRUD APIs of the data models we want Audit History tracking enabled on, e.g:

```csharp
[AutoApply(Behavior.AuditQuery)]
public class QueryBookings : QueryDb<Booking>
{
    //...
}

[AutoApply(Behavior.AuditCreate)]
public class CreateBooking : ICreateDb<Booking>, IReturn<IdResponse>
{
    //...
}

[AutoApply(Behavior.AuditModify)]
public class UpdateBooking : IPatchDb<Booking>, IReturn<IdResponse>
{
    //...
}

[AutoApply(Behavior.AuditSoftDelete)]
public class DeleteBooking : IDeleteDb<Booking>, IReturnVoid
{
    //...
}
```

::: tip
Use `Behavior.AuditDelete` instead if you prefer your delete operations to result in permanent hard deletes
:::

## Audit Info in Locode

The `AutoQueryFeature.AccessRole` determines the accessibility of the CRUD Event APIs that Locode uses to display the Audit History logs of each entity, which by default is restricted to **Admin** users, who will be able to view the Audit History of each record at the bottom of its **Edit** Form.

With our Bookings CRUD APIs now configured with Audit behavior, we can see an example of what this looks like in Locode after the Employee User account records a Booking from **John Smith** for a **Single** room:

![](/img/pages/locode/audit-history-create.png)

The left section displays audit information about the CRUD operation and the User making it, including their UserName, Id and IP. The right section contains the info sent in the Request DTO, in this case the `CreateBooking` API.

If **John Smith** later contacts the manager to upgrade his booking to a **Suite**, the Audit information will be updated with the `UpdateBooking` Audit entry which, as an `IPatchDb<Table>` operation, only contains the information that **changed**:

![](/img/pages/locode/audit-history-update.png)

This is typically why the behavior of `IPatchDb<Table>` is preferable over `IUpdateDb<Table>` APIs when Audit Tracking is enabled, as otherwise each Update operation would instead contain the entire entry on each update.
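The `AutoQueryFeature.AccessRole` mentioned above can be set when registering the plugin. A minimal sketch, assuming you want to relax the default **Admin** restriction (`RoleNames.AllowAnyUser` is ServiceStack's built-in role constant for any authenticated user):

```csharp
services.AddPlugin(new AutoQueryFeature {
    MaxLimit = 1000,
    // Default is RoleNames.Admin; relax so any authenticated user
    // can call the CRUD Event APIs Locode uses for Audit History
    AccessRole = RoleNames.AllowAnyUser,
});
```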
## Full Executable Audit History

As each Audit Entry contains the CRUD Request DTO, they can be used to recreate the state of the database by replaying each event & executing them against a blank database:

```csharp
var eventsPlayer = new CrudEventsExecutor(appHost);
foreach (var crudEvent in dbEvents.GetEvents(db))
{
    await eventsPlayer.ExecuteAsync(crudEvent);
}
```

This can provide similar benefits to [Event Sourcing](https://martinfowler.com/eaaDev/EventSourcing.html) without its additional development & maintenance burden, where events can be used to reconstruct the state of the database at a specific point in time, or to populate external data stores like search indexes or analytical reporting databases to enable greater insight than what an RDBMS snapshot can provide.

You could replay all events to reconstruct the entire state of the database, or choose just the tables you want to rebuild. E.g. you can fetch all Audit Events for just the `Booking` table with:

```csharp
var bookingAuditEvents = dbEvents.GetEvents(Db, nameof(Booking));
```

Alternatively you can fetch all the audit events of a single row, which is what Locode uses to display its Audit Events:

```csharp
var rowAuditEvents = dbEvents.GetEvents(Db, nameof(Booking), id);
```

## Complete Bookings CRUD Implementation

The [Bookings CRUD Demo](/autoquery/crud-bookings) is a good representative example of the effort it takes to implement a traditional CRUD API with AutoQuery:

<lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="rSFiikDjGos" style="background-image: url('https://img.youtube.com/vi/rSFiikDjGos/maxresdefault.jpg')"></lite-youtube>

It reduces the development effort of creating Full Stack Apps down to declaring your Data Models and APIs with simple POCOs that define the precise schema of the underlying RDBMS tables and API contract, which can then benefit from:

- Productive end-to-end [typed Development Model in 11 languages](https://servicestack.net/service-reference)
- [Declarative Dev Model](/locode/declarative) for defining Authentication, Validation, Documentation & UI Customizations
- Beautiful, UX-Friendly, capability-based Customizable UI in [Locode](https://servicestack.net/locode)
- Rich analysis, API discoverability & [simplified client integrations](/api-explorer#code-tab) in [API Explorer](/api-explorer)
- Powerful querying capabilities in [AutoQuery](/autoquery/)
- Full executable [Audit History Tracking](/autoquery/audit-log)
- Access to ServiceStack's rich ecosystem of [typed clients, versatile formats & endpoints](/why-servicestack#multiple-clients)
- Seamless integrations with [Open API](/openapi), interactive [Jupyter Notebooks](/jupyter-notebooks) & [Instant Client Apps](https://apps.servicestack.net/)

It also gains access to ServiceStack's rich ecosystem of features, most of which are centered around your typed API contracts, making them easy to enhance & apply to your existing Services. All without needing to write a single line of implementation logic, thanks to the default implementation in AutoQuery Services & the Auto UIs in Locode, API Explorer & Swagger UI. At the same time, when needed, the default behavior can be overridden at multiple levels, from custom AutoQuery implementations on the server to custom UIs on the client.

For completeness, the entire source code used to implement the Bookings CRUD implementation is below:

```csharp
public class Booking : AuditBase
{
    [AutoIncrement]
    public int Id { get; set; }
    public string Name { get; set; }
    public RoomType RoomType { get; set; }
    public int RoomNumber { get; set; }
    public DateTime BookingStartDate { get; set; }
    public DateTime? BookingEndDate { get; set; }
    public decimal Cost { get; set; }
    public string Notes { get; set; }
    public bool?
        Cancelled { get; set; }
}

public enum RoomType
{
    Single,
    Double,
    Queen,
    Twin,
    Suite,
}

[AutoApply(Behavior.AuditQuery)]
public class QueryBookings : QueryDb<Booking>
{
    public int[] Ids { get; set; }
}

[ValidateHasRole("Employee")]
[AutoApply(Behavior.AuditCreate)]
public class CreateBooking : ICreateDb<Booking>, IReturn<IdResponse>
{
    public string Name { get; set; }
    public RoomType RoomType { get; set; }
    [ValidateGreaterThan(0)]
    public int RoomNumber { get; set; }
    public DateTime BookingStartDate { get; set; }
    public DateTime? BookingEndDate { get; set; }
    [ValidateGreaterThan(0)]
    public decimal Cost { get; set; }
    public string Notes { get; set; }
}

[ValidateHasRole("Employee")]
[AutoApply(Behavior.AuditModify)]
public class UpdateBooking : IPatchDb<Booking>, IReturn<IdResponse>
{
    public int Id { get; set; }
    public string Name { get; set; }
    public RoomType? RoomType { get; set; }
    [ValidateGreaterThan(0)]
    public int? RoomNumber { get; set; }
    public DateTime? BookingStartDate { get; set; }
    public DateTime? BookingEndDate { get; set; }
    [ValidateGreaterThan(0)]
    public decimal? Cost { get; set; }
    public bool?
        Cancelled { get; set; }
    public string Notes { get; set; }
}

[ValidateHasRole("Manager")]
[AutoApply(Behavior.AuditSoftDelete)]
public class DeleteBooking : IDeleteDb<Booking>, IReturnVoid
{
    public int Id { get; set; }
}
```

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/auth/auth-repository.md

---
slug: auth-repository
title: Auth Repository
---

ServiceStack Auth supports using your own persistence back-ends, but for the most part you should be able to reuse one of the existing [IAuthRepository](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack/Auth/IAuthRepository.cs) implementations:

- **OrmLite**: `OrmLiteAuthRepository` in [ServiceStack.Server](https://nuget.org/packages/ServiceStack.Server)
  - [OrmLiteAuthRepositoryMultitenancy](/multitenancy#multitenancy-rdbms-authprovider)
- **Redis**: `RedisAuthRepository` in [ServiceStack](https://nuget.org/packages/ServiceStack)
- **Memory**: `InMemoryAuthRepository` in [ServiceStack](https://nuget.org/packages/ServiceStack)
- **AWS DynamoDB**: `DynamoDbAuthRepository` in [ServiceStack.Aws](https://nuget.org/packages/ServiceStack.Aws)
- **Mongo DB**: `MongoDBAuthRepository` in [ServiceStack.Authentication.MongoDB](https://nuget.org/packages/ServiceStack.Authentication.MongoDB)
- **Raven DB**: `RavenUserAuthRepository` in [ServiceStack.Authentication.RavenDB](https://nuget.org/packages/ServiceStack.Authentication.RavenDB)
- **Marten**: `MartenAuthRepository` in [ServiceStack.Authentication.Marten](https://www.nuget.org/packages/ServiceStack.Authentication.Marten) - [GitHub project](https://github.com/migajek/ServiceStack.Authentication.Marten)
- **LiteDB**: `LiteDBAuthRepository` in [ServiceStack.Authentication.LiteDB](https://github.com/CaveBirdLabs/ServiceStack.Authentication.LiteDB)

#### Registering an Auth Repository

The `OrmLiteAuthRepository` is the most common Auth Repository, which lets you persist User Info in any of the [RDBMS's that
OrmLite supports](/ormlite/#ormlite-rdbms-providers). All Auth Repositories are registered by adding an `IAuthRepository` dependency in your IOC, e.g:

```csharp
container.Register<IDbConnectionFactory>(c =>
    new OrmLiteConnectionFactory(connectionString, SqlServer2012Dialect.Provider));

container.Register<IAuthRepository>(c =>
    new OrmLiteAuthRepository(c.Resolve<IDbConnectionFactory>()));

container.Resolve<IAuthRepository>().InitSchema();
```

Calling `InitSchema()` will create the necessary RDBMS `UserAuth` and `UserAuthDetails` tables if they don't already exist.

By default the Users' Roles and Permissions are blobbed on the `UserAuth` table, but if preferred they can optionally be maintained in a separate `UserAuthRole` table with:

```csharp
container.Register<IAuthRepository>(c =>
    new OrmLiteAuthRepository(c.Resolve<IDbConnectionFactory>()) {
        UseDistinctRoleTables = true
    });
```

Like the [caching providers](/caching), the `async` Auth Repositories make use of this existing `IAuthRepository` registration, so your Services can access either the `IAuthRepositoryAsync` or `IAuthRepository` APIs. This works even for your own sync Auth Repos that only implement `IAuthRepository`, as an `IAuthRepositoryAsync` wrapper API is returned in its place.

### Auth Repository Admin APIs

If you're interested in implementing a User Management feature in your own Apps you may want to re-use the Admin APIs in the [User Admin Feature](/admin-ui-users), which enable Service access to many User Auth Repository features.

### Mix in Auth Repository

The easiest way to configure a User Auth Repository in your [Modular Startup](/modular-startup) App that new ASP.NET Core templates support is to [mix them in](/mix-tool#composable-features), e.g.
you can configure to use an `OrmLiteAuthRepository` with SQL Server with:

:::sh
npx add-in auth-db sqlserver
:::

You can view the other Auth Repository "mix ins" available with:

:::sh
npx add-in [auth]
:::

Which displays the current list of available Auth Repositories:

```
Results matching tag [auth]:

1. auth             Configure AuthFeature to: $HOST by @ServiceStack [auth]
2. auth-db          Use OrmLite Auth Repository (requires auth) to: $HOST by @ServiceStack [auth]
3. auth-redis       Use Redis Auth Repository (requires auth) to: $HOST by @ServiceStack [auth]
4. auth-memory      Use Memory Auth Repository (requires auth) to: $HOST by @ServiceStack [auth]
5. auth-dynamodb    Use DynamoDB Auth Repository (requires auth) to: $HOST by @ServiceStack [auth]
6. auth-mongodb     Use MongoDB Auth Repository (requires auth) to: $HOST by @ServiceStack [auth]
7. auth-ravendb     Use RavenDB Auth Repository (requires auth) to: $HOST by @ServiceStack [auth]
8. auth-marten      Use Marten Auth Repository (requires auth) to: $HOST by @ServiceStack [auth]
9. feature-authrepo List and Search Users in an Auth Repo to: $HOST by @ServiceStack [feature,auth]
```

and search the available RDBMS's and NoSQL Data Stores that can be easily configured by a mix in:

:::sh
npx add-in [db]
:::

```
Results matching tag [db]:

1. redis      Use ServiceStack.Redis to: $HOST by @ServiceStack [db]
2. sqlserver  Use OrmLite with SQL Server to: $HOST by @ServiceStack [db]
3. sqlite     Use OrmLite with SQLite to: $HOST by @ServiceStack [db]
4. postgres   Use OrmLite with PostgreSQL to: $HOST by @ServiceStack [db]
5. mysql      Use OrmLite with MySql to: $HOST by @ServiceStack [db]
6. oracle     Use OrmLite with Oracle to: $HOST by @ServiceStack [db]
7. firebird   Use OrmLite with Firebird to: $HOST by @ServiceStack [db]
8. dynamodb   Use AWS DynamoDB and PocoDynamo to: $HOST by @ServiceStack [db]
9. mongodb    Use MongoDB to: $HOST by @ServiceStack [db]
10. ravendb   Use RavenDB to: $HOST by @ServiceStack [db]
11. marten    Use Marten NoSQL with PostgreSQL to: $HOST by @ServiceStack [db]
```

### Sync & Async Auth Repositories

All built-in ServiceStack Auth Repositories implement both `IUserAuthRepository` and `IUserAuthRepositoryAsync`, which you can use inside ServiceStack Services via the `AuthRepositoryAsync` property, e.g:

```csharp
//async
public async Task<object> Post(GetUserAuth request)
{
    var userAuth = await AuthRepositoryAsync.GetUserAuthByUserNameAsync(request.UserName);
    if (userAuth == null)
        throw HttpError.NotFound(request.UserName);
    return userAuth;
}

//sync
public object Post(GetUserAuth request)
{
    var userAuth = AuthRepository.GetUserAuthByUserName(request.UserName);
    if (userAuth == null)
        throw HttpError.NotFound(request.UserName);
    return userAuth;
}
```

Outside of ServiceStack you can access it from the AppHost's `GetAuthRepositoryAsync()` or `GetAuthRepository()` APIs, e.g:

```csharp
//async
var authRepo = HostContext.AppHost.GetAuthRepositoryAsync();
await using (authRepo as IAsyncDisposable)
{
    //...
}

//sync
var authRepo = HostContext.AppHost.GetAuthRepository();
using (authRepo as IDisposable)
{
    //...
}
```

## Extending UserAuth tables

There are a number of different extensibility options for extending ServiceStack Authentication, from linking to external tables with its `RefId` and `RefIdStr` fields to storing custom info in the `Meta` Dictionaries.

Most Auth Repositories like OrmLite also support utilizing custom `UserAuth` tables with extended fields, which can be configured using its generic Constructor, e.g:

```csharp
public class MyUserAuth : UserAuth { .... }
public class MyUserAuthDetails : UserAuthDetails { ....
}
```

```csharp
container.Register<IAuthRepository>(c =>
    new OrmLiteAuthRepository<MyUserAuth, MyUserAuthDetails>(c.Resolve<IDbConnectionFactory>()) {
        UseDistinctRoleTables = true
    });
```

The [Auth Repository mix gists](/mix-tool#mix-in-auth-repository) are configured with an example using a custom `AppUser` table, which is populated using the [Session and Auth Events](/auth/sessions#session-events) hooks, e.g:

```csharp
// Custom User Table with extended Metadata properties
public class AppUser : UserAuth
{
    public string ProfileUrl { get; set; }
    public string LastLoginIp { get; set; }
    public DateTime? LastLoginDate { get; set; }
}

public class AppUserAuthEvents : AuthEvents
{
    public override void OnAuthenticated(IRequest req, IAuthSession session,
        IServiceBase authService, IAuthTokens tokens, Dictionary<string, string> authInfo)
    {
        var authRepo = HostContext.AppHost.GetAuthRepository(req);
        using (authRepo as IDisposable)
        {
            var userAuth = (AppUser)authRepo.GetUserAuth(session.UserAuthId);
            userAuth.ProfileUrl = session.GetProfileUrl();
            userAuth.LastLoginIp = req.UserHostAddress;
            userAuth.LastLoginDate = DateTime.UtcNow;
            authRepo.SaveUserAuth(userAuth);
        }
    }
}
```

#### Custom UserAuth Tables

If you want to add support for custom `UserAuth` and `UserAuthDetails` tables in your own custom Auth Repositories you'll need to implement the `ICustomUserAuth` interface by returning the concrete Types that should be used instead:

```csharp
public interface ICustomUserAuth
{
    IUserAuth CreateUserAuth();
    IUserAuthDetails CreateUserAuthDetails();
}
```

If implementing a generic class like `OrmLiteAuthRepository<TUserAuth,TUserAuthDetails>` you can return new instances of the Generic Type Arguments with:

```csharp
IUserAuth CreateUserAuth() => (IUserAuth)typeof(TUserAuth).CreateInstance();
IUserAuthDetails CreateUserAuthDetails() => (IUserAuthDetails)typeof(TUserAuthDetails).CreateInstance();
```

### Adding additional metadata to the Meta dictionary fields

For minor extensions you can
use the **Meta** string dictionary fields on the UserAuth tables to maintain custom metadata. They include useful `Get<T>` and `Set<T>` methods which can be used to blob additional complex types with each User, e.g:

```csharp
userAuth.Set(new Address { ... });
var address = userAuth.Get<Address>();
```

### Linking referential data with RefId and RefIdStr fields

The `UserAuth` and `UserAuthDetails` tables also include `int? RefId` and `string RefIdStr` fields which you can use to reference external data, like your own custom tables, against each User Auth record or User OAuth registration.

### Extend UserAuthSession with your own typed Custom Session

In addition to custom UserAuth tables, you can also use a custom `AuthUserSession` for maintaining typed User Sessions, which get blobbed in a fast [Caching Provider](/caching) whose schema-less persistence characteristics easily support fast access to extended types:

```csharp
public class CustomUserSession : AuthUserSession { ... }

appHost.Plugins.Add(new AuthFeature(
    () => new CustomUserSession(), ...);
```

### IAuthRepository APIs

Inside your Services you can access the **async** `base.AuthRepositoryAsync` and **sync** `base.AuthRepository` APIs with:

```csharp
await base.AuthRepositoryAsync.CreateUserAuthAsync(...);
```

You can use the Async APIs with every Auth Repository, as an async wrapper is returned for Auth Repositories which only support the Sync APIs.
If you need to access the Auth Repository from inside a sync method you can access the **sync** APIs from `base.AuthRepository`, e.g:

```csharp
base.AuthRepository.CreateUserAuth(...);
```

All ServiceStack's built-in Auth Repositories support the extended `IUserAuthRepository` APIs which your Services can use to manage your App's registered users:

```csharp
public interface IUserAuthRepository : IAuthRepository
{
    IUserAuth CreateUserAuth(IUserAuth newUser, string password);
    IUserAuth UpdateUserAuth(IUserAuth existingUser, IUserAuth newUser);
    IUserAuth UpdateUserAuth(IUserAuth existingUser, IUserAuth newUser, string password);
    IUserAuth GetUserAuth(string userAuthId);
    void DeleteUserAuth(string userAuthId);
}

public interface IUserAuthRepositoryAsync : IAuthRepositoryAsync
{
    Task<IUserAuth> CreateUserAuthAsync(IUserAuth newUser, string password, CancellationToken token);
    Task<IUserAuth> UpdateUserAuthAsync(IUserAuth existingUser, IUserAuth newUser, CancellationToken token);
    Task<IUserAuth> UpdateUserAuthAsync(IUserAuth existingUser, IUserAuth newUser, string password);
    Task<IUserAuth> GetUserAuthAsync(string userAuthId, CancellationToken token);
    Task DeleteUserAuthAsync(string userAuthId, CancellationToken token);
}

public interface IAuthRepository
{
    void LoadUserAuth(IAuthSession session, IAuthTokens tokens);
    void SaveUserAuth(IAuthSession authSession);
    List<IUserAuthDetails> GetUserAuthDetails(string userAuthId);
    IUserAuthDetails CreateOrMergeAuthSession(IAuthSession authSession, IAuthTokens tokens);
    IUserAuth GetUserAuth(IAuthSession authSession, IAuthTokens tokens);
    IUserAuth GetUserAuthByUserName(string userNameOrEmail);
    void SaveUserAuth(IUserAuth userAuth);
    bool TryAuthenticate(string userName, string password, out IUserAuth userAuth);
    bool TryAuthenticate(Dictionary<string, string> digestHeaders, string privateKey,
        int nonceTimeOut, string sequence, out IUserAuth userAuth);
}

public interface IAuthRepositoryAsync
{
    Task LoadUserAuthAsync(IAuthSession
        session, IAuthTokens tokens, CancellationToken token);
    Task SaveUserAuthAsync(IAuthSession authSession, CancellationToken token);
    Task<List<IUserAuthDetails>> GetUserAuthDetailsAsync(string userAuthId, CancellationToken token);
    Task<IUserAuthDetails> CreateOrMergeAuthSessionAsync(IAuthSession authSession, IAuthTokens tokens);
    Task<IUserAuth> GetUserAuthAsync(IAuthSession authSession, IAuthTokens tokens, CancellationToken token);
    Task<IUserAuth> GetUserAuthByUserNameAsync(string userNameOrEmail, CancellationToken token);
    Task SaveUserAuthAsync(IUserAuth userAuth, CancellationToken token);
    Task<IUserAuth> TryAuthenticateAsync(string userName, string password, CancellationToken token);
    Task<IUserAuth> TryAuthenticateAsync(Dictionary<string, string> digestHeaders, string privateKey,
        int nonceTimeOut, string sequence, CancellationToken token);
}
```

### Updating UserAuth tables directly

If you need finer-grained access than the shared APIs above, you can update the `UserAuth` and `UserAuthDetails` POCOs in your preferred persistence provider directly. E.g. if you're using the `OrmLiteAuthRepository` to store your Users in an RDBMS back-end, you can use [OrmLite APIs](/ormlite/) to update the user details stored in the `UserAuth` and `UserAuthDetails` tables, e.g:

```csharp
Db.UpdateOnly(() => new UserAuth { DisplayName = newName },
    where: q => q.Id == userId);
```

Which will only update the `DisplayName` column for the specified user.

If you're using a [Custom UserAuth Table](/auth/auth-repository#extending-userauth-tables) (e.g. `AppUser`) instead of the default `UserAuth`, you would need to update that POCO data model instead.

### IManageRoles API

The [IManageRoles API](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack/Auth/IAuthRepository.cs#L72) can be implemented by any `IAuthRepository` to provide an alternative strategy for querying and managing Users' Roles and Permissions.
This API is used by the `OrmLiteAuthRepository` to provide an alternative way to store Roles and Permissions in their own distinct table, rather than being blobbed with the rest of the User Auth data. This behavior can be enabled in `OrmLiteAuthRepository` by specifying `UseDistinctRoleTables=true` at registration, e.g:

```csharp
container.Register<IAuthRepository>(c =>
    new OrmLiteAuthRepository(c.Resolve<IDbConnectionFactory>()) {
        UseDistinctRoleTables = true,
    });
```

When enabled, Roles and Permissions are persisted in the distinct **UserAuthRole** table instead of being blobbed on the `UserAuth` table. The `IAuthSession.HasRole()` and `IAuthSession.HasPermission()` methods on the User's Session should be used to check whether a User has a specified Role or Permission.

If you're persisting Roles in a distinct table you'll need to use the `IManageRoles` APIs to access & manage a user's Roles, e.g:

```csharp
var manageRoles = (IManageRolesAsync)base.AuthRepositoryAsync; // async
var manageRoles = (IManageRoles)base.AuthRepository;           // sync
```

These APIs can be used with `OrmLiteAuthRepository` whether Roles are persisted in external tables or not.
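For illustration, a minimal sketch of managing a user's Roles through the `IManageRoles` APIs inside a Service (the `userAuthId` variable and `"Manager"` role name are assumptions for this example):

```csharp
// Hypothetical usage of the IManageRoles APIs listed below
var manageRoles = (IManageRoles)base.AuthRepository;

// Assign a Role to a user by their UserAuth Id
manageRoles.AssignRoles(userAuthId, roles: new[] { "Manager" });

// Check the assignment, regardless of whether Roles are
// blobbed on UserAuth or stored in the UserAuthRole table
if (manageRoles.HasRole(userAuthId, "Manager"))
{
    // User now has the Manager role
}
```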
```csharp
public interface IManageRoles
{
    ICollection<string> GetRoles(string userAuthId);
    ICollection<string> GetPermissions(string userAuthId);
    void GetRolesAndPermissions(string userAuthId, out ICollection<string> roles,
        out ICollection<string> permissions);
    bool HasRole(string userAuthId, string role);
    bool HasPermission(string userAuthId, string permission);
    void AssignRoles(string userAuthId, ICollection<string> roles = null,
        ICollection<string> permissions = null);
    void UnAssignRoles(string userAuthId, ICollection<string> roles = null,
        ICollection<string> permissions = null);
}

public interface IManageRolesAsync
{
    Task<ICollection<string>> GetRolesAsync(string userAuthId, CancellationToken token);
    Task<ICollection<string>> GetPermissionsAsync(string userAuthId, CancellationToken token);
    Task<Tuple<ICollection<string>,ICollection<string>>> GetRolesAndPermissionsAsync(
        string userAuthId, CancellationToken token);
    Task<bool> HasRoleAsync(string userAuthId, string role, CancellationToken token);
    Task<bool> HasPermissionAsync(string userAuthId, string permission, CancellationToken token);
    Task AssignRolesAsync(string userAuthId, ICollection<string> roles = null,
        ICollection<string> permissions = null, CancellationToken token = default);
    Task UnAssignRolesAsync(string userAuthId, ICollection<string> roles = null,
        ICollection<string> permissions = null, CancellationToken token = default);
}
```

More examples of this are in [ManageRolesTests.cs](https://github.com/ServiceStack/ServiceStack/blob/master/tests/ServiceStack.Common.Tests/ManageRolesTests.cs).
### Assigning Roles and Permissions

Super Users with the **Admin** role or Requests with an [AdminAuthSecret](/debugging#authsecret) can call the built-in `/assignroles` and `/unassignroles` Services to add Roles/Permissions to existing users from an external Request, e.g:

```csharp
var client = new JsonApiClient(baseUrl);

var response = client.Post(new AssignRoles {
    UserName = userName,
    Roles = new List<string> { "TheRole" },
    Permissions = new List<string> { "ThePermission" }
});
```

Inside ServiceStack you can use the `AssignRoles` API to add Roles and Permissions to an existing User:

```csharp
var userAuth = await AuthRepositoryAsync.GetUserAuthByUserNameAsync(userName);
if (userAuth == null)
    throw HttpError.NotFound(userName);

await AuthRepositoryAsync.AssignRolesAsync(userAuth,
    new[]{ "TheRole" }, new[]{ "ThePermission" });
```

Alternatively you can add Roles when creating a new User with:

```csharp
await AuthRepositoryAsync.CreateUserAuthAsync(new UserAuth {
    UserName = userName,
    FirstName = "John",
    LastName = "Doe",
    DisplayName = "John Doe",
    Roles = new List<string> { "TheRole" }
}, userPassword);
```

### Customizing User Roles and Permissions

The default implementation of User Roles and Permissions on [AuthUserSession](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack/AuthUserSession.cs) shows how ServiceStack's `[RequiredRole]` and `[RequiredPermission]` [Roles and Permission attributes](/auth/authentication-and-authorization#the-requiredrole-and-requiredpermission-attributes) are validated:

```csharp
public virtual bool HasPermission(string permission)
{
    var managesRoles = HostContext.TryResolve<IAuthRepository>() as IManageRoles;
    if (managesRoles != null)
    {
        return managesRoles.HasPermission(this.UserAuthId, permission);
    }
    return this.Permissions != null && this.Permissions.Contains(permission);
}

public virtual bool HasRole(string role)
{
    var managesRoles = HostContext.TryResolve<IAuthRepository>() as IManageRoles;
    if (managesRoles != null)
    {
        return managesRoles.HasRole(this.UserAuthId, role);
    }
    return this.Roles != null && this.Roles.Contains(role);
}
```

These APIs are `virtual` so they can be overridden in your Custom `AuthUserSession`. They default to looking at the `Roles` and `Permissions` collections stored on the Session. These collections are normally sourced from the `AuthUserRepository` when persisting the [UserAuth and UserAuthDetails POCO's](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack/Auth/UserAuth.cs) and are used to populate the `UserAuthSession` on successful Authentication.

These collections can be further customized by Auth Providers, which is what `AspNetWindowsAuthProvider` does to add [Authenticated WindowsAuth Roles](https://github.com/ServiceStack/ServiceStack/blob/9773b7fccc31ac4d715a02221f396b46cd14d7db/src/ServiceStack/Auth/AspNetWindowsAuthProvider.cs#L126).

As seen above, Roles/Permissions can instead be managed by Auth Repositories that implement the `IManageRoles` API, which is what `OrmLiteAuthRepository` uses to look at distinct RDBMS tables to validate Roles/Permissions.

### Microsoft Graph Roles

[OAuth Providers](/auth#auth-providers) like Microsoft Graph have their own global roles for users managed separately.
In order to combine Microsoft Graph's Azure AD Roles with App-defined Roles when using the `OrmLiteAuthRepository`, it needs to be configured to persist Roles in distinct role tables (required to capture the source of each role):

```csharp
services.AddSingleton<IAuthRepository>(c =>
    new OrmLiteAuthRepository<AppUser, UserAuthDetails>(c.Resolve<IDbConnectionFactory>()) {
        UseDistinctRoleTables = true
    });
```

Once configured you'll be able to manage your App's local User Roles via ServiceStack's [Auth Repository](/auth/auth-repository), [Assign Roles APIs](/auth/auth-repository#assigning-roles-and-permissions) or built-in [Admin Users UI](/admin-ui-users) without interfering with Azure AD managed roles:

<div class="block flex justify-center items-center">
    <a href="/admin-ui-users"><img class="max-w-screen-md" src="/img/pages/admin-ui/users-edit-default.png"></a>
</div>

### PBKDF2 Password Hashing implementation

By default ServiceStack uses the same [PBKDF2](https://en.wikipedia.org/wiki/PBKDF2) password hashing algorithm as ASP.NET Identity v3, both for new users and for successful authentication logins of existing users, whose passwords will automatically be re-hashed with the new implementation.

This also means if you wanted to switch, you'll be able to import ASP.NET Identity v3 User Accounts and their Password Hashes into ServiceStack.Auth's `UserAuth` tables and vice-versa.

#### Retain previous Password Hashing implementation

If preferred you can revert to using the existing `SaltedHash` implementation with:

```csharp
SetConfig(new HostConfig {
    UseSaltedHash = true
});
```

This also supports "downgrading" passwords that were hashed with the new `IPasswordHasher` provider, where it will revert to using the older/weaker `SaltedHash` implementation on successful authentication.
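For reference, the PBKDF2 primitive this scheme is built on is available directly in the .NET BCL. The sketch below is only an illustrative derivation — the salt size, iteration count and key length here are arbitrary choices for the example and this is not ServiceStack's actual storage format:

```csharp
using System;
using System.Security.Cryptography;

// Illustrative PBKDF2 derivation (parameters arbitrary; not ServiceStack's format)
var salt = RandomNumberGenerator.GetBytes(16);
using var pbkdf2 = new Rfc2898DeriveBytes(
    "p@55w0rd", salt, iterations: 10000, HashAlgorithmName.SHA256);
var subkey = pbkdf2.GetBytes(32); // derived key stored alongside the salt

Console.WriteLine(Convert.ToBase64String(subkey));
```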
#### Override Password Hashing Strength

The new [PasswordHasher](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack/Auth/PasswordHasher.cs) implementation can also be made computationally stronger or weaker by adjusting the iteration count (default 10000), e.g:

```csharp
container.Register<IPasswordHasher>(new PasswordHasher(1000));
```

#### Versionable Password Hashing

The new [IPasswordHasher](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Auth/IPasswordHasher.cs) interface includes support for versioning future Password Hashing algorithms and rehashing:

```csharp
public interface IPasswordHasher
{
    // First byte marker used to specify the format used. The default implementation uses format:
    // { 0x01, prf (UInt32), iter count (UInt32), salt length (UInt32), salt, subkey }
    byte Version { get; }

    // Returns a boolean indicating whether the providedPassword matches the hashedPassword.
    // The needsRehash out parameter indicates whether the password should be re-hashed.
    bool VerifyPassword(string hashedPassword, string providedPassword, out bool needsRehash);

    // Returns a hashed representation of the supplied password
    string HashPassword(string password);
}
```

This is implemented in all ServiceStack Auth Repositories, where passwords that used a different version or weaker strength will be rehashed by utilizing the new API for verifying passwords:

```csharp
if (userAuth.VerifyPassword(password, out var needsRehash))
{
    this.RecordSuccessfulLogin(userAuth, needsRehash, password);
    return true;
}
```

If you're using a Custom Auth Repository it will need to use the new password verification APIs, please refer to [OrmLiteAuthRepository](https://github.com/ServiceStack/ServiceStack/blob/bed1d900de93f889cca05299df4c33a04b7ad7a7/src/ServiceStack.Server/Auth/OrmLiteAuthRepository.cs#L325-L359) for a complete concrete example.
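To make the versioning contract concrete, a hypothetical custom hasher could implement the interface above and report a distinct `Version` byte so older hashes get flagged for rehashing. The class below is an illustrative sketch, not a ServiceStack implementation — the `0x02` marker and the SHA512/20k-iteration scheme are arbitrary choices for this example:

```csharp
using System;
using System.Security.Cryptography;
using ServiceStack.Auth;

// Hypothetical hasher illustrating the IPasswordHasher versioning contract
public class CustomPasswordHasher : IPasswordHasher
{
    const int Iterations = 20000;

    public byte Version => 0x02; // arbitrary marker for this sketch's format

    public string HashPassword(string password)
    {
        var salt = RandomNumberGenerator.GetBytes(16);
        using var pbkdf2 = new Rfc2898DeriveBytes(
            password, salt, Iterations, HashAlgorithmName.SHA512);

        // Sketch format: { version, 16-byte salt, 32-byte subkey }
        var payload = new byte[1 + 16 + 32];
        payload[0] = Version;
        Buffer.BlockCopy(salt, 0, payload, 1, 16);
        Buffer.BlockCopy(pbkdf2.GetBytes(32), 0, payload, 17, 32);
        return Convert.ToBase64String(payload);
    }

    public bool VerifyPassword(string hashedPassword, string providedPassword,
        out bool needsRehash)
    {
        var payload = Convert.FromBase64String(hashedPassword);
        needsRehash = payload[0] != Version; // flag hashes stored with another scheme

        var salt = new byte[16];
        Buffer.BlockCopy(payload, 1, salt, 0, 16);
        using var pbkdf2 = new Rfc2898DeriveBytes(
            providedPassword, salt, Iterations, HashAlgorithmName.SHA512);

        var expected = new byte[32];
        Buffer.BlockCopy(payload, 17, expected, 0, 32);
        return CryptographicOperations.FixedTimeEquals(expected, pbkdf2.GetBytes(32));
    }
}
```

It would be registered the same way as the built-in hasher above, e.g. `container.Register<IPasswordHasher>(new CustomPasswordHasher());`.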
#### Fallback PasswordHashers

The list of `Config.FallbackPasswordHashers` can be used for migrating to a new Password Hashing algorithm by registering older Password Hashing implementations that were previously used to hash Users' passwords. Failed password verifications will fall back to check if the password was hashed with any of the registered `FallbackPasswordHashers`; if any are valid, the password attempt will succeed and the password will be re-hashed with the registered `IPasswordHasher` implementation.

### Digest Auth Hashes only created when needed

Digest Auth Hashes are only populated if the `DigestAuthProvider` is registered. If you ever intend to support Digest access authentication in the future but don't want to register the `DigestAuthProvider` just yet, you can force ServiceStack to continue to maintain Digest Auth Hashes with:

```csharp
new AuthFeature {
    CreateDigestAuthHashes = true
}
```

Users that don't have Digest Auth Hashes will need to log in again in order to have them populated. If you don't intend to use Digest Auth you can clear the `DigestHa1Hash` column in your `UserAuth` table, which is otherwise unused.

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/auth/authentication-and-authorization.md

---
slug: authentication-and-authorization
title: Authentication and Authorization
---

Built into ServiceStack is a simple and extensible Authentication Model that implements standard HTTP Session Authentication, where [Session Cookies](/auth/sessions) are used to send Authenticated Requests which reference Users' Custom UserSession POCO's in your App's registered [Caching Provider](/caching).

ServiceStack also includes a number of Auth Providers which "Authenticate per-request", in which case the Authenticated User Session is only attached to and lasts for the lifetime of the current `IRequest`.
The implementation details of each Auth Provider are transparent to your Application, where the same Attributes and APIs are used to retrieve, validate, authenticate and authorize Users.

ServiceStack's Authentication support is encapsulated in the optional `AuthFeature` plugin which provides an easy way to declaratively register and configure the multiple Auth Providers you want to allow in your Application. It's highly configurable, with a number of additional features like whether to enable built-in Registration for registering new Users, as well as Assign/UnAssign Roles Services that Admins can use to assign Roles/Permissions to existing users.

<lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="XKq7TkZAzeg" style="background-image: url('https://img.youtube.com/vi/XKq7TkZAzeg/maxresdefault.jpg')"></lite-youtube>

<div class="not-prose mx-auto px-8">
    <h3 id="identityauth-template" class="mb-4 text-4xl tracking-tight font-extrabold text-gray-900">
        Create a Project with ServiceStack Auth
    </h3>
    <servicestack-auth-templates></servicestack-auth-templates>
</div>

### Highly customizable and versatile

ServiceStack's Authentication is also highly customizable and versatile: from being able to choose from the plethora of available Auth Providers, or inheriting from them to create your own customized Auth Provider; inheriting `AuthUserSession` to use your own Custom POCO with additional info you want to maintain for your Users; storing User Sessions in any of the available [Caching Providers](/caching); adding custom logic by handling any of the [Auth and Session Events](/auth/sessions#session-events) raised throughout the Auth lifecycle; to specifying which back-end [Auth Repository](/auth/auth-repository) you want to persist your Authenticated Users in - supporting most popular RDBMS's and popular NoSQL data stores as seen in the high-level overview below:

### High Level Overview

![Authentication Overview](/img/pages/security/auth-highlevel-overview.svg?sanitize=true)

The `AuthenticateService` is the primary Service that manages Authentication, which delegates to the specified Auth Provider that performs the Authentication, made available via the following endpoints:

- `/auth/{provider}` - Authenticate against a specific Auth Provider
- `/auth` - API to check if a Request is authenticated: returns **200** with basic session info if authenticated or **401** if not.
- `/auth/logout` - Removes the Authenticated Session from the registered cache and clears Session Cookies.

### Credentials Auth Providers

If you would like ServiceStack to manage your App's entire Authentication and persistence of Users you would use one of the available Auth Repositories and authenticate against one of the following Auth Providers:

| Provider | Class Name | Route | Description |
|-|-|-|-|
| **Credentials** | `CredentialsAuthProvider` | **/auth/credentials** | Standard Authentication using Username/Password |
| **Basic Auth** | `BasicAuthProvider` | HTTP Basic Auth | Username/Password sent via [HTTP Basic Auth](https://en.wikipedia.org/wiki/Basic_access_authentication) |
| **Digest Auth** | `DigestAuthProvider` | HTTP Digest Auth | Username/Password hash via [HTTP Digest Auth](https://en.wikipedia.org/wiki/Digest_access_authentication) |

New Users can be created via the `/register` Registration Service which can be enabled with:

```csharp
Plugins.Add(new RegistrationFeature());
```

### OAuth Providers

The following OAuth Providers are built into ServiceStack and can be used in both ASP.NET Core and .NET Framework Apps:

| Provider | Class Name | Route | Create OAuth App Link |
|-|-|-|-|
| **Facebook** | `FacebookAuthProvider` | **/auth/facebook** | [developers.facebook.com/apps](https://developers.facebook.com/apps) |
| **Twitter** | `TwitterAuthProvider` | **/auth/twitter** | [dev.twitter.com/apps](https://dev.twitter.com/apps) |
| **Google** | `GoogleAuthProvider` | **/auth/google** | [console.developers.google.com](https://console.developers.google.com/apis/credentials) |
| **GitHub** | `GithubAuthProvider` | **/auth/github** | [github.com/settings/applications/new](https://github.com/settings/applications/new) |
| **Microsoft** | `MicrosoftGraphAuthProvider` | **/auth/microsoftgraph** | [apps.dev.microsoft.com](https://apps.dev.microsoft.com) |
| **LinkedIn** | `LinkedInAuthProvider` | **/auth/linkedin** | [www.linkedin.com/secure/developer](https://www.linkedin.com/secure/developer) |
| **Yammer** | `YammerAuthProvider` | **/auth/yammer** | [www.yammer.com/client_applications](http://www.yammer.com/client_applications) |
| **Yandex** | `YandexAuthProvider` | **/auth/yandex** | [oauth.yandex.ru/client/new](https://oauth.yandex.ru/client/new) |
| **VK** | `VkAuthProvider` | **/auth/vkcom** | [vk.com/editapp?act=create](http://vk.com/editapp?act=create) |
| **Odnoklassniki** | `OdnoklassnikiAuthProvider` | **/auth/odnoklassniki** | [www.odnoklassniki.ru/devaccess](http://www.odnoklassniki.ru/devaccess) |

<lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="aQqF3Sf2fco" style="background-image: url('https://img.youtube.com/vi/aQqF3Sf2fco/maxresdefault.jpg')"></lite-youtube>

### Session Authentication Overview

The diagram below outlines how standard session-based Authentication works and how the different providers interact in more detail:

![Session Based Authentication](/img/pages/security/auth-session-auth.svg?sanitize=true)

The **Auth Provider** steps are unique for each Auth Provider but otherwise adopt the same Authentication process, arriving at the same end result: an Authenticated `AuthUserSession` is persisted in the registered `ICacheClient` against the `ss-pid` Permanent Cookie if the `Authenticate` request specified `RememberMe=true`, otherwise against the `ss-id` Temporary Session Cookie.
After a Request is Authenticated, its Session Cookies are sent on subsequent requests and validated by ServiceStack's built-in `[Authenticate]` and other `[Require*]` attributes to restrict access to valid users:

![Session Requests](/img/pages/security/auth-session-requests.svg?sanitize=true)

Once authenticated the User's Session can be accessed in your **Services** using the Typed and minimal `IAuthSession` APIs:

```csharp
AuthUserSession session = base.SessionAs<AuthUserSession>();

IAuthSession session = base.GetSession();
```

Or if you've registered a Custom UserSession POCO in the `AuthFeature` constructor, use that instead of `AuthUserSession`.

Typed User Sessions are also accessible in all Filters and handlers that have access to the current `IRequest` with:

```csharp
AuthUserSession session = req.SessionAs<AuthUserSession>();

IAuthSession session = req.GetSession();
```

See the [Session docs](/auth/sessions) for more info about customizing Sessions and handling different Session and Auth events.

### Authentication per Request Auth Providers

These Auth Providers include authentication with each request, so the Authenticated User Session is only populated on the HTTP `IRequest` and not saved in the registered Cache Client.

Unlike the traditional Auth Providers above, where there is a separate "Authentication" request to establish authentication, Auth Providers that implement `IAuthWithRequest` instead send their Authentication "per-request", where it's only populated on the current `IRequest`:

![Auth with Request Auth Providers](/img/pages/security/auth-auth-with-request-providers.svg?sanitize=true)

Whilst the Authentication Process is different, you'd continue to use the same APIs and Attributes to access and validate the User's Session.
The following Auth Providers implement `IAuthWithRequest` and Authenticate per-request:

| Provider | Class Name | Auth Method | Description |
|-|-|-|-|
| **JWT** | `JwtAuthProvider` | Bearer Token | Stateless Auth Provider using [JSON Web Tokens](/auth/jwt-authprovider) |
| **API Keys** | `ApiKeyAuthProvider` | Bearer Token | Allow 3rd Parties access to [authenticate without a password](/auth/api-key-authprovider) |
| **Basic Auth** | `BasicAuthProvider` | Basic Auth | Authentication using [HTTP Basic Auth](https://en.wikipedia.org/wiki/Basic_access_authentication) |
| **Digest Auth** | `DigestAuthProvider` | Digest Auth | Authentication using [HTTP Digest Auth](https://en.wikipedia.org/wiki/Digest_access_authentication) |

<lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="NTCUT7atoLo" style="background-image: url('https://img.youtube.com/vi/NTCUT7atoLo/maxresdefault.jpg')"></lite-youtube>

Some other special Auth Providers that Authenticate per-request include:

- **Windows Auth** in `AspNetWindowsAuthProvider` - Authentication using [Windows Auth](https://support.microsoft.com/en-us/help/323176/how-to-implement-windows-authentication-and-authorization-in-asp-net) built into ASP.NET.
- **Claims Auth** in `NetCoreIdentityAuthProvider` - Pass through Auth Provider that delegates to ASP.NET Core Identity Auth or Identity Server.

### Integrated ASP.NET Core Authentication

The `NetCoreIdentityAuthProvider` is a bi-directional Authentication adapter that enables ServiceStack to use the same Authentication as the rest of your ASP.NET Core and MVC Application, where it enables the following popular scenarios:

- [Using ServiceStack Auth in MVC](/auth/identity-servicestack) - Use ServiceStack Auth to power ASP.NET Identity Auth, pre-configured in the [mvcauth](https://github.com/LegacyTemplates/mvcauth) project template.
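As a sketch, registering the adapter alongside other Auth Providers follows the same `AuthFeature` registration pattern used throughout these docs; the surrounding provider list here is illustrative:

```csharp
// Illustrative registration: the Claims Auth adapter sits alongside other providers
Plugins.Add(new AuthFeature(() => new AuthUserSession(),
    new IAuthProvider[] {
        new NetCoreIdentityAuthProvider(AppSettings), //Delegate to ASP.NET Core Identity Auth
        new CredentialsAuthProvider(AppSettings),     //Username/Password Auth
    }));
```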
### Community Auth Providers

- [Azure Active Directory](https://github.com/jfoshee/ServiceStack.Authentication.Aad) - Allow Custom App to login with Azure Active Directory
- [Azure Active Directory via Azure Graph for ServiceStack](https://github.com/ticky74/ServiceStack.Authentication.Azure)

### Basic Configuration

A minimal configuration needed to get Basic Authentication up and running is the following in `AppHost.Configure()` (derived from the [AuthTests unit test](https://github.com/ServiceStack/ServiceStack/blob/master/tests/ServiceStack.WebHost.Endpoints.Tests/AuthTests.cs)):

```csharp
public override void Configure(Container container)
{
    Plugins.Add(new AuthFeature(() => new AuthUserSession(),
        new IAuthProvider[] {
            new BasicAuthProvider(),       //Sign-in with HTTP Basic Auth
            new CredentialsAuthProvider(), //HTML Form post of UserName/Password credentials
        }));

    container.Register<ICacheClient>(new MemoryCacheClient());

    //The IAuthRepository is used to store the user credentials etc.
    //Implement this interface to adjust it to your app's data storage
    var userRepo = new InMemoryAuthRepository();
    container.Register<IAuthRepository>(userRepo);
}
```

[AuthWebTests](https://github.com/ServiceStack/ServiceStack/blob/master/tests/ServiceStack.AuthWeb.Tests/) is a simple project that shows all Auth Providers configured and working in the same app. See the [AppHost](https://github.com/ServiceStack/ServiceStack/blob/master/tests/ServiceStack.AuthWeb.Tests/AppHost.cs) for an example of the code and the [Web.config](https://github.com/ServiceStack/ServiceStack/blob/master/tests/ServiceStack.AuthWeb.Tests/Web.config) for an example of the configuration required to enable each Auth Provider.
### OAuth Configuration

Once you have the `ConsumerKey` and `ConsumerSecret` you need to configure them with your ServiceStack host, via [Web.config](https://github.com/ServiceStack/ServiceStack/blob/master/tests/ServiceStack.AuthWeb.Tests/Web.config), e.g:

```xml
<add key="oauth.RedirectUrl" value="https://yourhostname.com"/>
<add key="oauth.CallbackUrl" value="https://yourhostname.com/auth/{0}"/>
<add key="oauth.twitter.ConsumerKey" value="3H1FHjGbA1N0n0aT5yApA"/>
<add key="oauth.twitter.ConsumerSecret" value="MLrZ0ujK6DwyjlRk2YLp6HwSdoBjtuqwXeHDQLv0Q"/>
```

For [.NET Core](/web-new) or [ASP.NET Core Apps](/templates/corefx) you can add the same keys to your `appsettings.json`, e.g:

```json
{
    "oauth.RedirectUrl": "https://yourhostname.com",
    "oauth.CallbackUrl": "https://yourhostname.com/auth/{0}",
    "oauth.twitter.ConsumerKey": "3H1FHjGbA1N0n0aT5yApA",
    "oauth.twitter.ConsumerSecret": "MLrZ0ujK6DwyjlRk2YLp6HwSdoBjtuqwXeHDQLv0Q"
}
```

Each OAuth Config option falls back to the configuration without the provider name. If needed you can provide provider-specific configuration by including the Auth Provider Name in the configuration, e.g:

```xml
<add key="oauth.twitter.RedirectUrl" value="https://yourhostname.com"/>
<add key="oauth.twitter.CallbackUrl" value="https://yourhostname.com/auth/twitter"/>
```

Configuration can also be specified in code when registering the Auth Provider in the `AuthFeature` plugin in your AppHost, e.g:

```csharp
Plugins.Add(new AuthFeature(() => new AuthUserSession(),
    new IAuthProvider[] {
        new TwitterAuthProvider(appSettings) {
            RedirectUrl = "http://yourhostname.com/",
            CallbackUrl = "http://yourhostname.com/auth/twitter",
            ConsumerKey = "3H1FHjGbA1N0n0aT5yApA",
            ConsumerSecret = "MLrZ0ujK6DwyjlRk2YLp6HwSdoBjtuqwXeHDQLv0Q",
        },
    }));
```

::: info
The Callback URL in each Application should match the CallbackUrl for your application which is typically: http://yourhostname.com/auth/{Provider}, e.g. http://yourhostname.com/auth/twitter for Twitter.
:::

### Allowing External Redirects

External Redirects used in the `?continue` params of `/auth` requests are disabled by default; they can be re-enabled with:

```csharp
new AuthFeature(...) {
    ValidateRedirectLinks = AuthFeature.AllowAllRedirects
}
```

### Auth Repository

ServiceStack supports managing Users in multiple data stores via its [Auth Repository](/auth/auth-repository) abstraction and built-in providers.

### Session Persistence

Once authenticated the **AuthUserSession** model is populated and stored in the Cache using one of ServiceStack's [supported Caching providers](/caching). ServiceStack's Sessions simply use the [ICacheClient](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Caching/ICacheClient.cs) API, so any new provider added can be used for both Sessions and Caching, which currently includes:

- **Memory**: `MemoryCacheClient` in [ServiceStack](https://nuget.org/packages/ServiceStack)
- **Redis**: `RedisClient`, `PooledRedisClientManager` or `BasicRedisClientManager` in [ServiceStack.Redis](https://nuget.org/packages/ServiceStack.Redis)
- **OrmLite**: `OrmLiteCacheClient` in [ServiceStack.Server](https://nuget.org/packages/ServiceStack.Server)
- **AWS DynamoDB**: `DynamoDbCacheClient` in [ServiceStack.Aws](https://nuget.org/packages/ServiceStack.Aws)
- **Memcached**: `MemcachedClientCache` in [ServiceStack.Caching.Memcached](https://nuget.org/packages/ServiceStack.Caching.Memcached)
- **Azure**: `AzureTableCacheClient` in [ServiceStack.Azure](https://nuget.org/packages/ServiceStack.Azure)

The Auth Feature also allows you to specify your own custom `IUserAuthSession` type where you can capture additional metadata with your users' sessions, which will also get persisted and hydrated from the cache, e.g:

```csharp
Plugins.Add(new AuthFeature(() => new CustomUserSession(),
    ...
));
```

::: info
If you're using Custom Sessions and have `JsConfig.ExcludeTypeInfo=true`, you need to [explicitly enable it](http://stackoverflow.com/q/18842685/85785) with `JsConfig<TCustomSession>.IncludeTypeInfo=true`.
:::

After authentication the client will receive a cookie with a session id, which is used to fetch the correct session from the `ICacheClient` internally by ServiceStack. Thus, you can access the current session in a service:

```csharp
public class SecuredService : Service
{
    public object Get(Secured request)
    {
        var session = this.SessionAs<AuthUserSession>();
        return new SecuredResponse() { Test = "You're " + session.FirstName };
    }
}
```

ServiceStack's Authentication, Caching and Session providers are completely new, clean, dependency-free, testable APIs that don't rely on and are devoid of ASP.NET's existing membership, caching or session provider models.

### AuthSecret Admin Session

Super User Requests using [Config.AdminAuthSecret](/debugging#authsecret) return an Authenticated Admin UserSession whose default values can be modified at `AuthFeature.AuthSecretSession`:

- `DisplayName`: Admin
- `UserName`: authsecret
- `AuthProvider`: authsecret
- `Roles`: Admin
- `UserAuthId`: 0

### Embedded Login Page

`AuthFeature` adds a fallback **/login.html** page if the `HtmlRedirect` remains unchanged and no `/login.html` exists, otherwise if using a custom `/login` page in either **Razor** or **Script Pages** they'll continue to be used instead.
The default `/login.html` page provides an auto Login page that supports authentication via Credentials as well as generating a dynamic list of OAuth providers, e.g. the [NorthwindCrud](https://github.com/NetCoreApps/NorthwindCrud) `/login` page with Facebook OAuth looks like:

![](/img/pages/release-notes/v5.9/auth-login.png)

If you're using an SPA App with client-side routing to implement `/login`, the default login page can be disabled with:

```csharp
new AuthFeature {
    IncludeDefaultLogin = false
}
```

The login page supports the same `continue` or `ReturnUrl` redirect params as the [Logout API](#logout).

## World Validation

See the annotated [World Validation Docs](/world-validation) for a detailed walk through that showcases how the most popular **Server HTML rendered** approaches and **Client UI rendered** technologies implement the same built-in Authentication, Registration and protected Services.

## Project Templates

Most of [ServiceStack's Project Templates](/dotnet-new) are configured with Auth out-of-the-box, or it can be easily added to an empty [web](https://github.com/NetCoreTemplates/web) project template:

:::sh
npx create-net web ProjectName
:::

By [mixing in your desired auth](/mix-tool#mix-in-auth-repository) features, e.g. to configure your App to enable auth & maintain it in SQL Server run:

:::sh
npx add-in auth auth-db sqlserver
:::

Checkout the [Bookings CRUD YouTube demo](https://youtu.be/XpHAaCTV7jE) for a quick preview of this in action.
## Live Demos

To illustrate Authentication integration with ServiceStack, see the authentication-enabled [Live Demos](https://github.com/NetCoreApps/LiveDemos) below:

### .NET Core

- [Apple Sign In](https://github.com/NetCoreApps/AppleSignIn) - Apple Auth
- [Bookings CRUD](https://github.com/NetCoreApps/BookingsCrud) - Credentials, Facebook, Google, Microsoft Auth
- [New TechStacks](https://github.com/NetCoreApps/TechStacks) - GitHub, Twitter and JWT Auth
- [SimpleAuth.Mvc](https://github.com/NetCoreApps/SimpleAuth.Mvc) - Twitter, Facebook, GitHub, VK, Yandex and Credentials Auth
- [Chat](https://github.com/NetCoreApps/Chat) - Twitter, Facebook and GitHub Auth

### Mobile

- [Android Java Chat](https://github.com/ServiceStackApps/AndroidJavaChat) - Facebook, Twitter and Google Auth
- [Android Xamarin Chat](https://github.com/ServiceStackApps/AndroidXamarinChat) - Twitter Auth

### .NET Framework

- [HttpBenchmarks Application](https://github.com/ServiceStackApps/HttpBenchmarks) - [Step-by-Step Authentication Guide](https://github.com/ServiceStackApps/HttpBenchmarks#authentication) - Twitter, Facebook, Google, LinkedIn and Credentials Auth
- [Angular TechStacks](https://github.com/ServiceStackApps/TechStacks) - Twitter, GitHub and JWT Auth
- [Gistlyn](https://github.com/ServiceStack/Gistlyn) - GitHub and JWT Auth
- [AWS Auth](https://github.com/ServiceStackApps/AwsApps) - Twitter, Facebook, GitHub, Google, Yahoo, LinkedIn, and Credentials Auth
- [MVC and WebForms Example](/servicestack-integration) - Twitter, Facebook, GitHub, Google, Yahoo, LinkedIn, VK, Credentials and Windows Auth
- [Chat](https://github.com/ServiceStackApps/LiveDemos#chat) - Twitter, Facebook and GitHub Auth
- [React Chat](https://github.com/ServiceStackApps/ReactChat) - Twitter, Facebook and GitHub Auth
- [SocialBootstrap Api](https://github.com/ServiceStackApps/LiveDemos#social-bootstrap-api) - Twitter, Facebook, Yahoo and Credentials Auth

## Custom authentication and authorization

A good starting place to create your own Auth provider that relies on username/password validation is to subclass `CredentialsAuthProvider` and override the `bool TryAuthenticate(service, username, password)` method where you can provide your custom implementation. If you instead wanted to authenticate via HTTP Basic Auth you would subclass `BasicAuthProvider` instead.

Both the default [BasicAuthProvider](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack/Auth/BasicAuthProvider.cs) and [CredentialsAuthProvider](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack/Auth/CredentialsAuthProvider.cs) (which it extends) can be extended, and their behavior overwritten. An example is below:

#### Async Custom AuthProvider

```csharp
using ServiceStack;
using ServiceStack.Auth;

// From v5.10+
public class CustomCredentialsAuthProvider : CredentialsAuthProvider
{
    public override async Task<bool> TryAuthenticateAsync(IServiceBase authService,
        string userName, string password, CancellationToken token=default)
    {
        //Add here your custom auth logic (database calls etc)
        //Return true if credentials are valid, otherwise false
    }

    public override async Task<IHttpResult> OnAuthenticatedAsync(IServiceBase authService,
        IAuthSession session, IAuthTokens tokens, Dictionary<string, string> authInfo,
        CancellationToken token=default)
    {
        //Fill IAuthSession with data you want to retrieve in the app, e.g:
        session.FirstName = "some_firstname_from_db";
        //...

        //Call base method to Save Session and fire Auth/Session callbacks:
        return await base.OnAuthenticatedAsync(authService, session, tokens, authInfo, token);

        //Alternatively avoid built-in behavior and explicitly save session with:
        //session.IsAuthenticated = true;
        //await authService.SaveSessionAsync(session, SessionExpiry, token);
        //authService.Request.Items[Keywords.DidAuthenticate] = true;
        //return null;
    }
}
```

#### Sync Custom AuthProvider

```csharp
using ServiceStack;
using ServiceStack.Auth;

public class CustomCredentialsAuthProvider : CredentialsAuthProviderSync
{
    public override bool TryAuthenticate(IServiceBase authService,
        string userName, string password)
    {
        //Add here your custom auth logic (database calls etc)
        //Return true if credentials are valid, otherwise false
    }

    public override IHttpResult OnAuthenticated(IServiceBase authService,
        IAuthSession session, IAuthTokens tokens, Dictionary<string, string> authInfo)
    {
        //Fill IAuthSession with data you want to retrieve in the app, e.g:
        session.FirstName = "some_firstname_from_db";
        //...

        //Call base method to Save Session and fire Auth/Session callbacks:
        return base.OnAuthenticated(authService, session, tokens, authInfo);

        //Alternatively avoid built-in behavior and explicitly save session with:
        //session.IsAuthenticated = true;
        //authService.SaveSession(session, SessionExpiry);
        //authService.Request.Items[Keywords.DidAuthenticate] = true;
        //return null;
    }
}
```

Then you need to register your custom credentials auth provider:

```csharp
//Register all Authentication methods you want to enable for this web app.
Plugins.Add(new AuthFeature(() => new AuthUserSession(),
    new IAuthProvider[] {
        new CustomCredentialsAuthProvider(), //HTML Form post of User/Pass
    }
));
```

By default the `AuthFeature` plugin automatically registers the following (overridable) Service Routes:

```csharp
new AuthFeature {
    ServiceRoutes = new Dictionary<Type, string[]> {
        { typeof(AuthenticateService), new[]{ "/auth", "/auth/{provider}" }},
        { typeof(AssignRolesService), new[]{ "/assignroles" }},
        { typeof(UnAssignRolesService), new[]{ "/unassignroles" }},
    }
};
```

### Logout

You can do a GET or POST to `/auth/logout` to logout the authenticated user, or if you're using a C# client you can logout with:

```csharp
client.Post(new Authenticate { provider = "logout" });
```

#### Redirect URL

Logging out will remove the User's Session from the Server and Session Cookies from the Client and redirect to the url in the `continue`, `ReturnUrl` or configured `AuthFeature.HtmlRedirectReturnParam` **QueryString** or **FormData** Request param. If no redirect is specified it will fallback to redirecting to `session.ReferrerUrl`, the `Referer` HTTP Header or the configured `AuthProvider.CallbackUrl`.

### Authenticating with .NET Service Clients

On the client you can use the [C#/.NET Service Clients](/csharp-client) to easily consume your authenticated Services.
You can authenticate with your `CustomCredentialsAuthProvider` by POST'ing an `Authenticate` Request, e.g: ```csharp var client = new JsonApiClient(BaseUrl); var authResponse = client.Post(new Authenticate { provider = CredentialsAuthProvider.Name, //= credentials UserName = "test@gmail.com", Password = "p@55w0rd", RememberMe = true, }); ``` If authentication was successful the Service Client `client` instance will be populated with authenticated session cookies which then allows calling Authenticated services, e.g: ```csharp var response = client.Get(new GetActiveUserId()); ``` If you've also registered the `BasicAuthProvider` it will enable your Services to accept [HTTP Basic Authentication](https://en.wikipedia.org/wiki/Basic_access_authentication), which is built into the Service Clients and can be populated on the Service Client with: ```csharp client.UserName = "test@gmail.com"; client.Password = "p@55w0rd"; ``` Which will also let you access protected Services, e.g: ```csharp var response = client.Get(new GetActiveUserId()); ``` Behind the scenes this ends up making 2 requests: the 1st request is sent without credentials and gets rejected with a `401 Unauthorized`, then if the Server indicates it has the `BasicAuthProvider` enabled the request is resent with the HTTP Basic Auth credentials.
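The resent request carries a standard `Authorization: Basic ...` header, which per RFC 7617 is just `Basic` followed by the Base64 encoding of `user:password`. A minimal stdlib sketch of what the client constructs:

```csharp
using System;
using System.Text;

// RFC 7617: the Basic scheme is "Basic " + Base64("user:password")
string BasicAuthHeader(string userName, string password) =>
    "Basic " + Convert.ToBase64String(Encoding.UTF8.GetBytes($"{userName}:{password}"));

Console.WriteLine(BasicAuthHeader("test@gmail.com", "p@55w0rd"));
```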
You could instead save the latency of the additional auth challenge request by specifying the client should always send the Basic Auth with every request: ```csharp client.AlwaysSendBasicAuthHeader = true; ``` ### Authenticating with HTTP To Authenticate with your `CustomCredentialsAuthProvider` (which inherits from CredentialsAuthProvider) you would POST: **POST** localhost:60339/auth/credentials?format=json ```json { "UserName": "admin", "Password": "test", "RememberMe": true } ``` If authentication with the request above succeeds, the client will receive Cookies containing a session id which identify it on each subsequent Web Service call. ### Authentication via OAuth AccessTokens To improve OAuth Sign In integration from native Mobile or Desktop Apps you can also Authenticate via AccessTokens which can dramatically simplify the Development and User Experience by being able to leverage the Native Facebook, Twitter and Google Client SDK's to Sign In users locally then reuse their local **AccessToken** to Authenticate with back-end ServiceStack Servers. Example usage of this feature is in the [Integrated Facebook, Twitter and Google Logins](https://github.com/ServiceStackApps/AndroidJavaChat/#integrated-facebook-twitter-and-google-logins) in Android Java Chat which is also able to [Automatically Sign In users with saved AccessTokens](https://github.com/ServiceStackApps/AndroidJavaChat#automatically-sign-in-previously-signed-in-users). This capability is available on the popular OAuth Providers below: - `FacebookAuthProvider` - Sign in with Facebook - `TwitterAuthProvider` - Sign in with Twitter - `GithubAuthProvider` - Sign in with Github - `GoogleOAuth2Provider` - Sign in with Google It can also be enabled in other OAuth2 Providers by implementing `VerifyAccessToken` to manually validate whether the provided AccessToken is valid with the registered OAuth App.
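Manually validating an AccessToken typically means calling the provider's token-info endpoint and checking the token was issued to your registered app. The JSON-checking half of such a `VerifyAccessToken` implementation can be sketched with stdlib parsing (the sample payload and `ConsumerKey` value below are illustrative assumptions, not real credentials):

```csharp
using System;
using System.Text.Json;

// Illustrative app id; a real implementation compares against the
// ConsumerKey the OAuth App was registered with.
const string ConsumerKey = "my-app-id.apps.googleusercontent.com";

// A real implementation would fetch tokenInfoJson from the provider's
// token validation endpoint before running this check.
bool IsTokenIssuedToApp(string tokenInfoJson)
{
    using var doc = JsonDocument.Parse(tokenInfoJson);
    return doc.RootElement.TryGetProperty("issued_to", out var issuedTo)
        && issuedTo.GetString() == ConsumerKey;
}

var json = "{\"issued_to\":\"my-app-id.apps.googleusercontent.com\",\"expires_in\":3599}";
Console.WriteLine(IsTokenIssuedToApp(json)); // True
```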
The API to validate Access Tokens isn't part of the OAuth2 specification and is different (and often missing) for other OAuth2 providers. As an example, the `GoogleOAuth2Provider` uses a `VerifyAccessToken` implementation that's similar to: ```csharp new GoogleOAuth2Provider { VerifyAccessToken = accessToken => { var url = $"https://www.googleapis.com/oauth2/v1/tokeninfo?access_token={accessToken}"; var json = url.GetJsonFromUrl(); var obj = JsonObject.Parse(json); return obj["issued_to"] == ConsumerKey; } } ``` #### Client Authentication with AccessToken Clients can utilize this feature with the new `AccessToken` and `AccessTokenSecret` properties on the existing `Authenticate` Request DTO, sent with the **provider** that the AccessToken is for, e.g: ```csharp var response = client.Post(new Authenticate { provider = "facebook", AccessToken = facebookAccessToken, RememberMe = true, }); ``` ::: info Most OAuth Providers only require sending an `AccessToken` with Twitter being the exception which also requires sending an `AccessTokenSecret` ::: ### User Sessions Cache ServiceStack uses the [Cache Provider](/caching) which was registered in the IoC container: ```csharp //Register to use an In Memory Cache Provider (default) container.Register<ICacheClient>(new MemoryCacheClient()); //Configure an alt. distributed persisted cache, E.g Redis: //container.Register<IRedisClientsManager>(c => // new RedisManagerPool("localhost:6379")); ``` ::: info Tip If you've got multiple servers which run the same ServiceStack service, you can use Redis to share the sessions between these servers ::: Please look at [SocialBootstrapApi](https://github.com/ServiceStack/SocialBootstrapApi/tree/master/src/SocialBootstrapApi) to get a full example. ::: info Of course you can also implement your own - custom - authentication mechanism. 
You aren't forced to use the built-in ServiceStack auth mechanism ::: ## Declarative Validation Attributes The recommended way to protect your APIs is to use the [Declarative Validation](/declarative-validation) attributes which, as they're decoupled from any implementation, can be safely annotated on Request DTOs without adding any implementation dependencies. In addition, annotating Authorization and Validation attributes on Request DTOs captures this information in your APIs' reusable DTOs, filtering it down to clients where it can be used to provide enriched User Experiences. ### Authorization Attributes The available Typed Authorization Attributes include: | Attribute | Description | |--------------------------------|-------------------------------------------------------------------------| | `[ValidateIsAuthenticated]` | Protect access to this API to Authenticated Users only | | `[ValidateIsAdmin]` | Protect access to this API to Admin Users only | | `[ValidateHasPermission]` | Protect access to this API to only Users assigned with ALL Permissions | | `[ValidateHasRole]` | Protect access to this API to only Users assigned with ALL Roles | Where they can be annotated on **Request DTOs** to protect APIs: ```csharp [ValidateIsAuthenticated] // or [ValidateRequest("IsAuthenticated")] [ValidateIsAdmin] // or [ValidateRequest("IsAdmin")] [ValidateHasRole(role)] // or [ValidateRequest($"HasRole(`{role}`)")] [ValidateHasPermission(permission)] // or [ValidateRequest($"HasPermission(`{permission}`)")] public class Secured {} ``` ## The Authenticate attribute The `[Authenticate]` [Request Filter Attribute](/filter-attributes) tells ServiceStack which Services need authentication by adding it to your Service implementations, e.g: ```csharp [Authenticate] public class SecuredService : Service { public object Get(Secured request) { IAuthSession session = this.GetSession(); return new SecuredResponse() { Test = "You're " + session.FirstName }; } public object
Put(Secured request) { return new SecuredResponse() { Test = "Valid!" }; } public object Post(Secured request) { return new SecuredResponse() { Test = "Valid!" }; } public object Delete(Secured request) { return new SecuredResponse() { Test = "Valid!" }; } } ``` If, for example, you want authentication to only be required for GET and PUT requests, you can provide extra parameters to the `Authenticate` attribute: ```csharp [Authenticate(ApplyTo.Get | ApplyTo.Put)] ``` ## RequiredRole and RequiredPermission attributes ServiceStack also includes built-in role & permission based authorization attributes where you can apply the `[Required*]` Request Filter Attributes on your Service classes to apply to all Services or limited to a single Service: ```csharp [Authenticate] //All HTTP (GET, POST...) methods need "CanAccess" [RequiredRole("Admin")] [RequiredPermission("CanAccess")] public class MyServices : Service { public object Get(Secured request) {} [RequiredPermission("CanAdd")] public object Put(Secured request) {} [RequiredPermission("CanAdd")] public object Post(Secured request) {} [RequiredPermission("AdminRights", "CanDelete")] public object Delete(Secured request) {} } ``` Now the client needs the permissions: - **CanAccess** to make a GET request - **CanAccess**, **CanAdd** to make a PUT/POST request - **CanAccess**, **AdminRights** and **CanDelete** to make a DELETE request If instead you want to allow access to users in **ANY** Role or Permission use: ```csharp [RequiresAnyRole("Admin","Member")] [RequiresAnyRole(ApplyTo.Put | ApplyTo.Post, "Admin","Owner","Member")] [RequiresAnyPermission(ApplyTo.Delete, "AdminRights", "CanDelete")] public class MyServices : Service { public object Get(Secured request) {} public object Put(Secured request) {} public object Post(Secured request) {} public object Delete(Secured request) {} } ``` These attributes can also be applied to Request DTOs, however as they would add a dependency to **ServiceStack.dll**, it's
recommended to use the implementation-free Declarative Validation attributes above instead. ## Enabling Authentication at different levels ### Using the [Authenticate] Attribute You can protect services by adding the `[Authenticate]` attribute on either the Action: ```csharp class MyService : Service { [Authenticate] public object Get(Protected request) { ... } } ``` The Request DTO: ```csharp [Authenticate] class Protected { ... } ``` Or the service implementation: ```csharp [Authenticate] class MyService : Service { public object Get(Protected request) { ... } } ``` Or by inheriting from a base class: ```csharp [Authenticate] class MyServiceBase : Service { ... } class MyService : MyServiceBase { public object Get(Protected request) { ... } } ``` ### Using a Global Request Filter Otherwise you can use a [global Request Filter](/request-and-response-filters) if you wanted to restrict all requests any other way, e.g. something like: ```csharp GlobalRequestFiltersAsync.Add(async (req, res, requestDto) => { if (ShouldProtectRequest(requestDto)) { await new AuthenticateAttribute().ExecuteAsync(req, res, requestDto); } }); ``` ## Customizing AuthProviders #### CustomValidationFilter The `CustomValidationFilter` on all AuthProviders lets you add post verification logic after a user has signed in with an OAuth provider and their OAuth metadata is retrieved. The filter lets you return an `IHttpResult` to control what error response is returned, e.g: ```csharp new FacebookAuthProvider(appSettings) { CustomValidationFilter = authCtx => CustomIsValid(authCtx) ? authCtx.Service.Redirect(authCtx.Session.ReferrerUrl .AddHashParam("f","CustomErrorCode")) : null, } ``` Or it could be used to redirect a network of users to a "Not Available in your Area" page with: ```csharp Plugins.Add(new AuthFeature(..., new IAuthProvider[] { new CredentialsAuthProvider { CustomValidationFilter = authCtx => authCtx.Request.UserHostAddress.StartsWith("175.45.17") ?
HttpResult.Redirect("http://host.com/are-not-available") : null } })); ``` #### UserName Validation The UserName validation for all Auth Repositories is configurable at: ```csharp Plugins.Add(new AuthFeature(...){ ValidUserNameRegEx = new Regex(@"^(?=.{3,20}$)([A-Za-z0-9][._-]?)*$", RegexOptions.Compiled), }) ``` Instead of RegEx you can choose to validate using a Custom Predicate. The example below ensures UserNames don't include specific chars: ```csharp Plugins.Add(new AuthFeature(...){ IsValidUsernameFn = userName => userName.IndexOfAny(new[] { '@', '.', ' ' }) == -1 }) ``` #### AccountLocked Validator Use `AccountLockedValidator` to override the logic that determines when an account is locked, e.g. by default an Account is Locked when it has a `LockedDate`, but this can be changed to allow locking accounts at a future date with: ```csharp new CredentialsAuthProvider { AccountLockedValidator = (authRepo, userAuth, tokens) => userAuth.LockedDate != null && userAuth.LockedDate <= DateTime.UtcNow } ``` Alternatively if you're using a Custom Auth Provider you can just override `IsAccountLocked()` to change this behavior. #### Saving Extended OAuth Metadata The new `SaveExtendedUserInfo` property (enabled by default) on all OAuth providers lets you control whether to save the extended OAuth metadata available (into `UserAuthDetails.Items`) when logging in via OAuth. #### MaxLoginAttempts The `MaxLoginAttempts` feature lets you lock a User Account after multiple invalid login attempts, e.g: ```csharp Plugins.Add(new AuthFeature(...)
{ MaxLoginAttempts = 5 // Lock user after 5 Invalid attempts }); ``` ### Adding AuthProviders with Plugins Plugins can register AuthProviders by calling `RegisterAuthProvider()` before the `AuthFeature` plugin is registered, which can be achieved in Plugins by having them implement `IPreInitPlugin`: ```csharp public class MyPlugin : IPreInitPlugin { public void BeforePluginsLoaded(IAppHost appHost) { appHost.GetPlugin<AuthFeature>().RegisterAuthProvider(new MyAuthProvider()); } } ``` ### Auth Response Filter Auth Providers can customize the `AuthenticateResponse` returned by implementing `IAuthResponseFilter` where it will get called back with a populated [AuthFilterContext](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack/Auth/IAuthProvider.cs) for successful Authenticate Request DTO requests or `AuthResultContext` for successful OAuth requests: ```csharp public interface IAuthResponseFilter { // Intercept successful Authenticate Request DTO requests void Execute(AuthFilterContext authContext); // Intercept successful OAuth redirect requests Task ResultFilterAsync(AuthResultContext authContext, CancellationToken token=default); } public class AuthFilterContext { public AuthenticateService AuthService // Instance of AuthenticateService public IAuthProvider AuthProvider // Selected Auth Provider for Request public IAuthSession Session // Authenticated Users Session public Authenticate AuthRequest // Auth Request DTO public AuthenticateResponse AuthResponse // Auth Response DTO public string ReferrerUrl // Optimal Session Referrer URL to use redirects public bool AlreadyAuthenticated // If User was already authenticated public bool DidAuthenticate // If User Authenticated in this request } public class AuthResultContext { public IHttpResult Result // Response returned for this successful Auth Request public IServiceBase Service // Instance of Service used in this Request public IRequest Request // Current HTTP Request Context public 
IAuthSession Session // Authenticated Users Session } ``` The filters can be used to modify properties on the `AuthenticateResponse` DTO or OAuth successful redirect requests. To completely replace the `AuthenticateResponse` returned, you can specify an `AuthFeature.AuthResponseDecorator`. ### ICustomUserAuth The `ICustomUserAuth` interface can be implemented on User Auth Repositories that allow replacing the default `UserAuth` and `UserAuthDetails` tables by returning the concrete Type that should be used instead: ```csharp public interface ICustomUserAuth { IUserAuth CreateUserAuth(); IUserAuthDetails CreateUserAuthDetails(); } ``` This allows using the same `RegistrationFeature` and `RegisterService` to handle registering new users with the substituted `IUserAuth` and `IUserAuthDetails` Types. #### LoadUserAuthFilter The LoadUserAuthFilter on `AspNetWindowsAuthProvider` lets you retrieve more detailed information about Windows Authenticated users during Windows Auth Authentication by using .NET's ActiveDirectory services, e.g: ```csharp //... new AspNetWindowsAuthProvider(this) { LoadUserAuthFilter = LoadUserAuthInfo } //... public void LoadUserAuthInfo(AuthUserSession userSession, IAuthTokens tokens, Dictionary<string, string> authInfo) { if (userSession == null) return; using (PrincipalContext pc = new PrincipalContext(ContextType.Domain)) { var user = UserPrincipal.FindByIdentity(pc, userSession.UserAuthName); tokens.DisplayName = user.DisplayName; tokens.Email = user.EmailAddress; tokens.FirstName = user.GivenName; tokens.LastName = user.Surname; tokens.FullName = (String.IsNullOrWhiteSpace(user.MiddleName)) ?
$"{user.GivenName} {user.Surname}" : $"{user.GivenName} {user.MiddleName} {user.Surname}"; tokens.PhoneNumber = user.VoiceTelephoneNumber; } } ``` ### Customizable PopulateUserRoles on AspNetWindowsAuthProvider The `AspNetWindowsAuthProvider` uses the public `IPrincipal.IsInRole()` API to determine if a User is in a particular Windows Auth role, however this can be slow when needing to query a large number of roles in LDAP as it would need to make an LDAP lookup for each role. Performance of this can now be improved by specifying a custom `PopulateUserRoles` implementation that overrides how User Roles are resolved, e.g: ```csharp new AspNetWindowsAuthProvider(AppSettings) { PopulateUserRoles = (request, user, session) => { using (WindowsIdentity userId = request?.LogonUserIdentity) { var roles = new List<string>(); if (userId?.Groups != null) { foreach (var group in userId.Groups) { // Remove the domain name from the name of the group, // if it has it, and you don't need it. var groupName = new SecurityIdentifier(group.Value) .Translate(typeof(NTAccount)).ToString(); if (groupName.Contains("\\")) groupName = groupName.Split('\\')[1]; roles.Add(groupName); } } session.Roles = roles; } } } ``` ### In Process Authenticated Requests You can enable the `CredentialsAuthProvider` to allow **In Process** requests to Authenticate without a Password with: ```csharp new CredentialsAuthProvider { SkipPasswordVerificationForInProcessRequests = true, } ``` When enabled this lets **In Process** Service Requests login as a specified user without needing to provide their password.
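In effect the provider treats an in-process request as pre-verified. The decision can be sketched as a plain function, where `isInProcessRequest` stands in for `Request.IsInProcessRequest()` and the in-memory password store and names are illustrative assumptions, not ServiceStack APIs:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical credential store standing in for an Auth Repository
var passwords = new Dictionary<string, string> { ["manager@gmail.com"] = "p@55w0rd" };
bool skipPasswordVerificationForInProcessRequests = true;

// A user authenticates if the password checks out, or if the request
// originated in-process and password verification is configured to be skipped.
bool Authenticate(string userName, string password, bool isInProcessRequest) =>
    passwords.TryGetValue(userName, out var stored) &&
    (stored == password ||
     (isInProcessRequest && skipPasswordVerificationForInProcessRequests));

Console.WriteLine(Authenticate("manager@gmail.com", "", isInProcessRequest: true));  // True
Console.WriteLine(Authenticate("manager@gmail.com", "", isInProcessRequest: false)); // False
```

Note the user still has to exist: only the password check is skipped, never the account lookup.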
For example this could be used to create an [Intranet Restricted](/auth/restricting-services) **Admin-Only** Service that lets you login as another user so you can debug their account without knowing their password with: ```csharp [RequiredRole("Admin")] [Restrict(InternalOnly=true)] public class ImpersonateUser { public string UserName { get; set; } } public class MyAdminServices : Service { public async Task<object> Any(ImpersonateUser request) { using var service = base.ResolveService<AuthenticateService>(); //In Process return await service.PostAsync(new Authenticate { provider = AuthenticateService.CredentialsProvider, UserName = request.UserName, }); } } ``` ::: info Your Services can use the new `Request.IsInProcessRequest()` to identify Services that were executed in-process ::: ### Custom User Sessions using JWT Tokens The [JWT Auth Provider](/auth/jwt-authprovider) allows for a more flexible approach to impersonating users as they allow [Manually creating JWT Tokens](/auth/jwt-authprovider#creating-jwt-tokens-manually) to construct a custom User Session with Custom metadata, Roles and Permissions. ### IAuthMetadataProvider An IAuthMetadataProvider provides a way to customize the authInfo in all AuthProviders. It also allows overriding of how extended Auth metadata like profileUrl is returned. ```csharp public interface IAuthMetadataProvider { void AddMetadata(IAuthTokens tokens, Dictionary<string,string> authInfo); string GetProfileUrl(IAuthSession authSession, string defaultUrl = null); } ``` ::: info To override with a custom implementation, register `IAuthMetadataProvider` in the IOC ::: ### Generate New Session Cookies on Authentication The AuthFeature also regenerates new Session Cookies each time users login, this behavior can be disabled with: ```csharp Plugins.Add(new AuthFeature(...) 
{ GenerateNewSessionCookiesOnAuthentication = false }); ``` ### ClientId and ClientSecret OAuth Config Aliases OAuth Providers can use `ClientId` and `ClientSecret` aliases instead of `ConsumerKey` and `ConsumerSecret`, e.g: ```xml <appSettings> <add key="oauth.twitter.ClientId" value="..." /> <add key="oauth.twitter.ClientSecret" value="..." /> </appSettings> ``` ### Override Authorization HTTP Header Request Filters can override the Authorization HTTP Header used in Auth Providers with: ```csharp httpReq.Items[Keywords.Authorization] = $"Bearer {token}"; ``` ### GET Authenticate Requests are disabled by default **GET** `/auth/{provider}` requests are disabled by default to discourage sending confidential information in the URL. The current exceptions which still allow **GET** requests include: - `/auth` - Used to check if a User is Authenticated - `/auth/logout` - Logging Out - All OAuth Providers which start their OAuth flow by navigating to `/auth/{provider}` You can allow **GET** Authentication requests with: ```csharp new AuthFeature { AllowGetAuthenticateRequests = req => true } ``` It's recommended to change your code to use `POST` instead of `GET` requests, otherwise you can use the `IRequest req` parameter to check against a whitelist of known request types.
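One way to implement such a check is a whitelist of the GET-safe paths listed above. A minimal sketch of the predicate logic, using a plain path string in place of the `IRequest` parameter (the function and set names are illustrative, not ServiceStack APIs):

```csharp
using System;
using System.Collections.Generic;

// GET-safe Authenticate paths; anything else must use POST
var allowedGetPaths = new HashSet<string>(StringComparer.OrdinalIgnoreCase)
{
    "/auth",        // check if a User is Authenticated
    "/auth/logout", // logging out
};

bool AllowGetAuthenticate(string pathInfo) => allowedGetPaths.Contains(pathInfo);

Console.WriteLine(AllowGetAuthenticate("/auth"));             // True
Console.WriteLine(AllowGetAuthenticate("/auth/credentials")); // False
```

In a real `AllowGetAuthenticateRequests` delegate you would consult the `IRequest` (e.g. its path or selected provider) rather than a raw string.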
<a name="community"></a> # Community Resources - [Simple Web Service Authentication with ServiceStack](https://steveellwoodnlc.medium.com/simple-web-service-authentication-with-servicestack-7294fe5493a2) by [@steveellwood](https://steveellwoodnlc.medium.com) - [Adding Facebook Authentication using ServiceStack](http://buildclassifieds.com/2016/01/14/facebookauth/) by [@markholdt](https://twitter.com/markholdt) - [How to return JSV formatted collection types from SQL Server in OrmLite](http://blog.falafel.com/Blogs/adam-anderson/2013/10/28/how-to-return-jsv-formatted-collection-types-from-sql-server-to-servicestack.ormlite) by [AdamAnderson](http://blog.falafel.com/blogs/AdamAnderson) - [How to migrate ASP.NET Membership users to ServiceStack](http://blog.falafel.com/Blogs/adam-anderson/2013/10/23/how-to-migrate-asp.net-membership-users-to-servicestack) by [AdamAnderson](http://blog.falafel.com/blogs/AdamAnderson) - [Authentication in ServiceStack REST Services](http://www.binaryforge-software.com/wpblog/?p=242) by [@binaryforge](https://twitter.com/binaryforge) - [Building a ServiceStack OAuth2 resource server using DotNetOpenAuth](http://dylanbeattie.blogspot.com/2013/08/building-servicestack-based-oauth2.html) by [@dylanbeattie](https://twitter.com/dylanbeattie) - [Declarative authorization in REST services in SharePoint with F#](http://sergeytihon.wordpress.com/2013/06/28/declarative-authorization-in-rest-services-in-sharepoint-with-f-and-servicestack/) by [@sergey_tihon](https://twitter.com/sergey_tihon) - [Authenticate ServiceStack services against an Umbraco membership provider](http://stackoverflow.com/a/16845317/85785) by [Gavin Faux](http://stackoverflow.com/users/1664508/gavin-faux) - [Using OAuth with ArcGIS Online and ServiceStack](http://davetimmins.com/post/2013/april/oauth-with-arcgisonline-servicestack) by [@davetimmins](https://twitter.com/davetimmins) - [LinkedIn Provider for ServiceStack 
Authentication](http://www.binoot.com/2013/03/30/linkedin-provider-for-servicestack-authentication/) by [@binu_thayamkery](https://twitter.com/binu_thayamkery) - [A Step by Step guide to create a Custom IAuthProvider](http://enehana.nohea.com/general/customizing-iauthprovider-for-servicestack-net-step-by-step/) by [@rngoodness](https://twitter.com/rngoodness) - [Simple API Key Authentication With ServiceStack](http://rossipedia.com/blog/2013/03/simple-api-key-authentication-with-servicestack/) by [@rossipedia](https://twitter.com/rossipedia) - [CORS BasicAuth on ServiceStack with custom authentication](http://joeriks.com/2013/01/12/cors-basicauth-on-servicestack-with-custom-authentication/) by [@joeriks](https://twitter.com/joeriks) - [Authenticating ServiceStack REST API using HMAC](https://www.jokecamp.com/blog/authenticating-servicestack-rest-api-using-hmac/) by [@jokecamp](https://twitter.com/jokecamp) - ServiceStack Credentials Authentication and EasyHttp: [Part 1](http://blogs.lessthandot.com/index.php/DesktopDev/MSTech/servicestack-credentialsauthentication-and-easyhtpp-of), [Part 2](http://blogs.lessthandot.com/index.php/DesktopDev/MSTech/servicestack-credentialsauthentication-and-easyhtpp-of-1), [Part 3](http://blogs.lessthandot.com/index.php/DesktopDev/MSTech/servicestack-credentialsauthentication-and-easyhtpp-of-2) by [@chrissie1](https://twitter.com/chrissie1) --- # Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/auto-batched-requests.md --- slug: auto-batched-requests title: Auto Batched Requests --- One of the best ways to improve performance, efficiency and reduce latency is to minimize the number of network requests required, which is one of the reasons we've always encouraged [Coarse-grained API designs](/why-servicestack#servicestack-encourages-development-of-message-style-re-usable-and-batch-full-web-services) - which also lend themselves to better encapsulation and re-use. 
A common use-case that can be improved is clients making multiple requests to the same API which, due to the lack of a better alternative batched API or control over the server implementation, default to making multiple N+1 web service requests. ### Pre-defined Routes For [Endpoint Routing](/endpoint-routing), the pre-defined route for Auto Batched Requests is: <div class="not-prose"> <h3 class="text-4xl text-center text-indigo-800 pb-3">/api/{Request}[]</h3> </div> For [Legacy Routing](/routing), the pre-defined route for Auto Batched Requests is: <div class="not-prose"> <h3 class="text-4xl text-center text-indigo-800 pb-3">/json/reply/{Request}[]</h3> </div> ## All Services support Batching Thanks to its [message-based design](/advantages-of-message-based-web-services), ServiceStack is able to enable high-level generic functionality like Request Batching which is now implicitly available for all Services, without any additional effort - where multiple requests of the same type can be sent together in a single HTTP Request. This is enabled in all [.NET Service Clients](/csharp-client) via the new `SendAll()` and `SendAllOneWay()` API's, e.g: ```csharp var client = new JsonApiClient(BaseUrl); var requests = new[] { new Request { Id = 1, Name = "Foo" }, new Request { Id = 2, Name = "Bar" }, new Request { Id = 3, Name = "Baz" }, }; List<Response> responses = client.SendAll(requests); ``` The API works as you would expect where multiple requests can be sent together and the Service Client will return a list of all responses in the same order as the requests were sent. ServiceStack also adds the `X-AutoBatch-Completed` HTTP Response Header containing the **number** of Requests that were executed. E.g.
if one of the Requests threw an Exception it will contain the number of requests that were processed before the Exception was thrown, which short-circuits processing the remaining Auto Batched requests and returns a populated [structured Error Response](/error-handling) of the Exception. And on the back-end, your Services are none the wiser, remaining focused on handling a single Request DTO. In the case below the Service does some work then stores the response in Redis before returning it: ```csharp public class MyServices : Service { public object Any(Request request) { var response = DoWork(request); Redis.Store(response); return response; } } ``` ## Request Execution Flow From the Service's point of view nothing changes. Request DTO's still get executed one at a time, through all existing filters just as if they were sent on their own. They're just delivered together within a single HTTP Request, in this case POST'ed as JSON to the `/json/reply/Request[]` [pre-defined route](/routing#pre-defined-routes): ![Auto Batched Requests](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/release-notes/auto-batched-requests.png) ## Custom Batched Requests Implementations If a client was previously calling the same API 100 times, the existing overhead of 100 HTTP Requests would be reduced to just **1 HTTP Request** when batched, although the above Service would still be calling Redis 100 times to store each Response. If this API later becomes really hot and you want to improve it even further, you can add a custom implementation that accepts a `Request[]` and it will only get called once, with access to all the Request DTO's together.
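The execution semantics described above, where requests run one at a time in order and processing short-circuits on the first Exception while reporting how many completed (cf. the `X-AutoBatch-Completed` header), can be sketched independently of ServiceStack:

```csharp
using System;
using System.Collections.Generic;

// Runs each request in order; stops at the first failure and reports
// how many completed, mirroring the X-AutoBatch-Completed header.
(List<TRes> Responses, int Completed, Exception Error) ExecuteAll<TReq, TRes>(
    IEnumerable<TReq> requests, Func<TReq, TRes> handler)
{
    var responses = new List<TRes>();
    foreach (var request in requests)
    {
        try { responses.Add(handler(request)); }
        catch (Exception ex) { return (responses, responses.Count, ex); }
    }
    return (responses, responses.Count, null);
}

var (responses, completed, error) = ExecuteAll(new[] { 1, 2, 3 }, x => x * 10);
Console.WriteLine($"{completed} completed, error: {error?.Message ?? "none"}"); // 3 completed, error: none
```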
In this case we can use a custom implementation and take advantage of Redis's own batched API's and reduce this further to 1 Redis operation: ```csharp public class MyServices : Service { public object Any(Request request) { var response = DoWork(request); Redis.Store(response); return response; } public object Any(Request[] requests) { var responses = requests.Map(DoWork); Redis.StoreAll(responses); return responses; } } ``` So with this custom implementation we've gone from **100 HTTP Requests + 100 Redis Operations** to **1 HTTP Request + 1 Redis Operation**. Another scenario where you may consider using a **Custom Batched Implementation** is if you wanted to execute all requests within a single RDBMS transaction, which with [OrmLite](/ormlite/) would look something like: ```csharp public class MyServices : Service { public object Any(Request request) { var response = DoWork(request); Db.Insert(request); return response; } public object Any(Request[] requests) { using (var trans = Db.OpenTransaction()) { var responses = requests.Map(x => Any(x)); trans.Commit(); return responses; } } } ``` Just like with normal Batched Requests, Custom Batched implementations are still executed one at a time through all request/response filters, taking advantage of any existing logic/validation. ## Defining a Request DTO to accept a collection of Types If you instead only wanted multiple Requests to be treated as a single Request through the entire pipeline you can create a new Request DTO that inherits from `List<TRequest>` which then gets treated as a normal Request DTO e.g: ```csharp public class Requests : List<Request> {} public class MyServices : Service { ...
public object Any(Requests requests) { var responses = requests.Map(DoWork); Redis.StoreAll(responses); return responses; } } ``` More examples of Auto Batched Requests and its behavior can be found in the [ReplyAllTests suite](https://github.com/ServiceStack/ServiceStack/blob/master/tests/ServiceStack.WebHost.Endpoints.Tests/ReplyAllTests.cs). ## Auto Batch Index The current index of the Auto Batched Request being processed is now being maintained in `IRequest.Items[Keywords.AutoBatchIndex]`. In Error Responses the index of the request that failed is now being populated in your Response DTO's `ResponseStatus.Meta["AutoBatchIndex"]`. To also maintain the active `AutoBatchIndex` in [Custom Batched Requests Implementations](#custom-batched-requests-implementations) you can use the `IRequest.EachRequest()` extension method, e.g: ```csharp public object Any(GetCustomAutoBatchIndex[] requests) { var responses = new List<GetAutoBatchIndexResponse>(); Request.EachRequest<GetCustomAutoBatchIndex>(dto => { responses.Add(Any(dto)); }); return responses; } ``` --- # Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/auto-html-api.md --- title: Auto HTML API Page --- The Auto HTML Page provides instant utility for API consumers in consuming your APIs with a built-in API Response Visualizer, JSON syntax highlighting, integrated Postman-like UI and API SDK integration all-in-one. 
<div class="not-prose hide-title my-16 px-4 sm:px-6"> <div class="text-center"> <h3 id="autohtml" class="text-4xl sm:text-5xl md:text-6xl tracking-tight font-extrabold text-gray-900"> Auto HTML API </h3> </div> <p class="mx-auto mt-5 max-w-prose text-xl text-gray-500"> The best way to visualize, inspect and integrate with your APIs in an instant!</p> <div class="my-8"> <lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="3gjisRVqhLo" style="background-image: url('https://img.youtube.com/vi/3gjisRVqhLo/maxresdefault.jpg')"></lite-youtube> </div> </div> Let's take a quick tour through each of these features: ## PREVIEW Calling our APIs in a browser will greet us with the Preview page which uses the [HtmlFormat](/vue/formats#htmlformat) to display the API response in a beautiful Tailwind style with links to different [Content-Type Formats](/formats) and direct links to view it in [API Explorer](/api-explorer) and [Locode](/locode/) for [AutoQuery](/autoquery/rdbms) APIs: <a href="https://northwind.netcore.io/customers/ALFKI" class="not-prose max-w-4xl"> <div class="block flex justify-center shadow hover:shadow-lg rounded"> <img class="" src="/img/pages/ui/autohtml-preview.png"> </div> </a> ## JSON Developers who wish to view the actual JSON API Response can click on the **JSON** tab to view the JSON in pretty-formatted syntax-highlighted form with a 1-click button to copy: <a href="https://northwind.netcore.io/customers/ALFKI?tab=json" class="not-prose max-w-4xl"> <div class="block flex justify-center shadow hover:shadow-lg rounded"> <img class="" src="/img/pages/ui/autohtml-json.png"> </div> </a> ## FORM You don't need to build UIs before non-developers can access your APIs with the **FORM** tab which uses the new [AutoForm](/vue/autoform) component to render an optimal UI to call an API that you can further customize from your C# Request DTOs: <a href="https://blazor-gallery.servicestack.net/bookings/1?tab=form" class="not-prose max-w-4xl"> <div
class="block flex justify-center shadow hover:shadow-lg rounded"> <img class="" src="/img/pages/ui/autohtml-form-bookings.png"> </div> </a> ## CODE The **CODE** tab gives you an appealing API docs page you can immediately share with any 3rd Party Developers that want to consume your APIs, with simple step-by-step instructions for how to call your APIs from their preferred programming language: <a href="https://northwind.netcore.io/customers/ALFKI?tab=code" class="not-prose max-w-4xl"> <div class="block flex justify-center shadow hover:shadow-lg rounded"> <img class="" src="/img/pages/ui/autohtml-code.png"> </div> </a> A nice benefit of ServiceStack's API Design is that consuming APIs is fundamentally done the same way in every language: you just add a dependency containing a generic ServiceClient that can call any ServiceStack API using the typed DTOs copied directly from the API docs page, enabling an end-to-end typed API without any external tooling or build steps. ## Overriding Auto HTML API Like most of ServiceStack's built-in UIs, the Auto HTML API can be customized the same way by providing a local [HtmlFormat.html](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack/src/ServiceStack/Templates/HtmlFormat.html) at the same path in your AppHost Project's `/wwwroot/Templates` folder: ```files
/wwwroot/Templates
    HtmlFormat.html
``` ## API Fallback HTML Page The Auto HTML API is the fallback HTML page returned for APIs when calling user-defined routes from a browser (i.e. 
**Accept: text/html**): ### [/bookings/1](https://blazor-vue.web-templates.io/bookings/1) When calling the [/api pre-defined route](/routing#json-api-pre-defined-route) with the `.html` extension: ### [/api/QueryBookings.html?Id=1](https://blazor-vue.web-templates.io/api/QueryBookings.html?Id=1) When calling the [/api pre-defined route](/routing#json-api-pre-defined-route) with `?format=html`: ### [/api/QueryBookings?Id=1&format=html](https://blazor-vue.web-templates.io/api/QueryBookings?Id=1&format=html) --- # Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/auto-mapping.md --- slug: auto-mapping title: Auto Mapping --- ## Using ServiceStack's Built-in Auto-mapping Although [we encourage keeping separate DTO models](http://stackoverflow.com/a/15369736/85785), you don't need to maintain your own manual mapping as you can use ServiceStack's built-in Auto Mapping support. It's quite comprehensive and resilient, doing a good job of coercing one type into another, e.g. you can convert between different Enum types with the same name, between Enums and any value type and Strings, between properties and fields, POCOs and strings and many things in between - some of which can be seen in these [Auto Mapping tests](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack.Text/tests/ServiceStack.Text.Tests/AutoMappingTests.cs). 
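A minimal sketch of the coercions described above, using the built-in `ConvertTo<T>` extension method (the enum types here are illustrative, not from the library):

```csharp
public enum Color  { Red, Green }
public enum Colour { Red, Green }

// Enum -> Enum with matching names
var colour = Color.Green.ConvertTo<Colour>(); // Colour.Green

// Enum -> String
var name = Color.Red.ConvertTo<string>();     // "Red"

// String -> Value Type
var num = "42".ConvertTo<int>();              // 42
```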
Here are some common use-cases you're likely to hit in your web service development: Create a new DTO instance, populated with matching properties on viewModel: ```csharp var dto = viewModel.ConvertTo<MyDto>(); var dto = new { Anon = "Object" }.ConvertTo<MyDto>(); ``` Initialize DTO and populate it with matching properties on a view model: ```csharp var dto = new MyDto { A = 1, B = 2 }.PopulateWith(viewModel); ``` Initialize DTO and populate it with **non-default** matching properties on a view model: ```csharp var dto = new MyDto { A = 1, B = 2 }.PopulateWithNonDefaultValues(viewModel); ``` Initialize DTO and populate it with matching properties that are annotated with the **Attr** Attribute on a view model: ```csharp var dto = new MyDto { A=1 } .PopulateFromPropertiesWithAttribute(viewModel, typeof(CopyAttribute)); ``` There is also the inverse for mapping all properties that don't include a specific attribute: ```csharp var safeUpdate = db.SingleById<MyTable>(id) .PopulateFromPropertiesWithoutAttribute(dto, typeof(ReadOnlyAttribute)); ``` ### Advanced mapping using Converters You can register a custom Converter mapping using the `AutoMapping.RegisterConverter()` APIs, e.g: ```csharp // Data.User -> DTO User AutoMapping.RegisterConverter((Data.User from) => { var to = from.ConvertTo<User>(skipConverters:true); // avoid infinite recursion to.FirstName = from.GivenName; to.LastName = from.Surname; return to; }); // Car -> String AutoMapping.RegisterConverter((Car from) => $"{from.Model} ({from.Year})"); // WrappedDate -> DateTime AutoMapping.RegisterConverter((WrappedDate from) => from.ToDateTime()); // DateTime -> WrappedDate AutoMapping.RegisterConverter((DateTime from) => new WrappedDate(from)); ``` Where it will be called whenever a conversion between `Data.User -> User` or `Car -> String` is needed, including nested types and collections. Converters can also be used when you want to "take over" and override the default conversion behavior. 
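Once registered, Converters are applied transparently by the normal mapping APIs, so no call-site changes are needed. A sketch using the `Data.User` registration above (property values are illustrative):

```csharp
var from = new Data.User { GivenName = "Ada", Surname = "Lovelace" };

// Invokes the registered Data.User -> User Converter,
// so FirstName/LastName are populated from GivenName/Surname
var to = from.ConvertTo<User>();
```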
### Intercept AutoMapping Conversions The `RegisterPopulator` AutoMapping API can be used to run custom logic after an Auto Mapping Conversion, e.g. after a `T.ConvertTo<T>()` or `T.PopulateWith(obj)` is performed. This is useful when you need to intercept Auto Mapping conversions in external libraries, e.g. you can use this to populate the UserSession's `UserAuthId` with a different field from your Custom UserAuth: ```csharp AutoMapping.RegisterPopulator((IAuthSession session, IUserAuth userAuth) => { if (userAuth is RavenUserAuth ravenUserAuth) { session.UserAuthId = ravenUserAuth.Key; } }); ``` ### Advanced mapping using custom extension methods When mapping logic becomes more complicated we like to use extension methods to keep code DRY and maintain the mapping in one place that's easily consumable from within your application, e.g: ```csharp public static class ConvertExtensions { public static MyDto ToDto(this MyViewModel from) { var to = from.ConvertTo<MyDto>(); to.Items = from.Items.ConvertAll(x => x.ToDto()); to.CalculatedProperty = Calculate(from.Seed); return to; } } ``` Which is now easily consumable with just: ```csharp var dto = viewModel.ToDto(); ``` Using C# methods ensures conversion is explicit, discoverable, debuggable, fast and flexible with access to the full C# language at your disposal whose conversion logic can be further DRY'ed behind reusable extension methods. If you find you need to call this extension method manually in many places you may want to consider registering a Custom Converter instead. 
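If that mapping should instead be applied everywhere automatically, the same logic can be moved into a Converter registration. A sketch reusing the hypothetical `MyViewModel`/`MyDto` types above:

```csharp
AutoMapping.RegisterConverter((MyViewModel from) => {
    var to = from.ConvertTo<MyDto>(skipConverters:true); // avoid infinite recursion
    to.Items = from.Items.ConvertAll(x => x.ToDto());
    to.CalculatedProperty = Calculate(from.Seed);
    return to;
});

// Now applied implicitly wherever the conversion is needed,
// including nested types and collections:
var dto = viewModel.ConvertTo<MyDto>();
```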
### Ignore Mapping Use the `AutoMapping.IgnoreMapping()` API to specify mappings you want to skip entirely, e.g: ```csharp // Ignore Data.User -> User AutoMapping.IgnoreMapping<Data.User, User>(); // Ignore List<Data.User> -> List<User> AutoMapping.IgnoreMapping<List<Data.User>, List<User>>(); ``` ### Support for Implicit / Explicit Type Casts The built-in Auto Mapping also supports using any `implicit` or `explicit` Value Type Casts when they exist, e.g: ```csharp struct A { public int Id { get; } public A(int id) => Id = id; public static implicit operator B(A from) => new B(from.Id); } struct B { public int Id { get; } public B(int id) => Id = id; public static implicit operator A(B from) => new A(from.Id); } var b = new A(1).ConvertTo<B>(); ``` ### Powerful and Capable Due to the heavy reliance on it in [#Script](https://sharpscript.net) and other parts of ServiceStack, the built-in Auto Mapping is a sophisticated implementation that covers a large number of use-cases and corner cases where types can be intuitively mapped. To see a glimpse of its available capabilities check out some of the examples in the docs where it's able to [call any method or construct any type dynamically](/reflection-utils#call-any-method-dynamically) using different Types. Or how it's able to [convert any Reference Type into and out of an Object Dictionary](/reflection-utils#converting-instances-from-an-object-dictionary), providing a simple approach to dynamically manipulating Types. 
### Populating Types from an Object Dictionary The `ToObjectDictionary` and `FromObjectDictionary` extension methods are also useful for converting loosely-typed data structures into typed POCOs, e.g: ```csharp var dto = new User { FirstName = "First", LastName = "Last", Car = new Car { Age = 10, Name = "ZCar" }, }; Dictionary<string,object> map = dto.ToObjectDictionary(); map["LastName"] = "Updated"; User user = (User)map.FromObjectDictionary(typeof(User)); ``` --- # Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/vue/autocomplete.md --- title: Autocomplete Component group: Component Gallery --- The `Autocomplete` component provides a user-friendly Input for searching and quickly selecting items, with support for partial items view and infinite scrolling. 
```html <form class="col-span-12"> <div class="mb-3"> <Autocomplete id="simple" :options="allContacts" v-model="simple" label="Single Contact" :match="(x: any, value: string) => x!.displayName.toLowerCase().includes(value.toLowerCase())" placeholder="Select Contact"> <template #item="{ displayName }"> <span class="block truncate">{{ displayName }}</span> </template> </Autocomplete> <div class="mt-2 flex justify-end"> <p> <b class="text-gray-500">Single:</b> <div v-if="simple" class="flex"> <img :src="simple.profileUrl" class="w-8 h-8 rounded-full mr-2"> <b class="text-lg">{{ simple.displayName }}</b> </div> </p> </div> </div> <div class="mb-3"> <Autocomplete id="contact" :options="allContacts" v-model="contact" label="Single Contact with Icon" :match="(x: any, value: string) => x!.displayName.toLowerCase().includes(value.toLowerCase())" placeholder="Select Contact"> <template #item="{ displayName, profileUrl }"> <div class="flex items-center"> <Icon class="h-6 w-6 flex-shrink-0 rounded-full" :src="profileUrl" loading="lazy" /> <span class="ml-3 truncate">{{ displayName }}</span> </div> </template> </Autocomplete> <div class="mt-2 flex justify-end"> <p> <b class="text-gray-500">Single with Icon:</b> <div v-if="contact" class="flex"> <img :src="contact.profileUrl" class="w-8 h-8 rounded-full mr-2"> <b class="text-lg">{{ contact.displayName }}</b> </div> </p> </div> </div> <div class="mb-3"> <Autocomplete id="contacts" :options="allContacts" v-model="contacts" multiple label="Multiple Contacts with Icon" :match="(x: any, value: string) => x!.displayName.toLowerCase().includes(value.toLowerCase())" placeholder="Select Contact"> <template #item="{ displayName, profileUrl }"> <div class="flex items-center"> <Icon class="h-6 w-6 flex-shrink-0 rounded-full" :src="profileUrl" loading="lazy" /> <span class="ml-3 truncate">{{ displayName }}</span> </div> </template> </Autocomplete> <div class="mt-2"> <div class="text-right"><b class="text-gray-500">Multiple with 
Icon:</b></div> <p> <div v-if="contacts.length" class="flex flex-wrap"> <div v-for="contact in contacts" class="flex ml-4 mb-2"> <img :src="contact.profileUrl" class="w-6 h-6 rounded-full mr-2"> <span>{{ contact.displayName }}</span> </div> </div> </p> </div> </div> </form> ``` ## Custom Form <div class="not-prose"> <autocomplete-examples class="max-w-prose"></autocomplete-examples> </div> --- # Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/ormlite/autogen-t4.md --- title: AutoGen & T4 Templates --- ## AutoQuery AutoGen The recommended way to auto generate Tables and APIs for your existing RDBMS tables is to use [AutoQuery AutoGen](/autoquery/autogen) whose declarative nature allows us to easily generate AutoQuery & Crud Services using just declarative DTOs. <lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="NaJ7TW-Q_pU" style="background-image: url('https://img.youtube.com/vi/NaJ7TW-Q_pU/maxresdefault.jpg')"></lite-youtube> ## T4 Templates [OrmLite's T4 Templates](https://github.com/ServiceStack/ServiceStack.OrmLite/tree/master/src/T4) are useful in database-first development or when wanting to use OrmLite with an existing RDBMS by automatically generating POCOs and strongly-typed wrappers for executing stored procedures. :::copy `<PackageReference Include="ServiceStack.OrmLite.T4" Version="10.*" />` ::: --- # Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/autoquery/autogen.md --- title: AutoQuery AutoGen CRUD Services --- Long-time users of ServiceStack will know it's a staunch proponent of **code-first development** where your C# Types retain the master authority of your App's logic, although there are a number of times where you have to work with existing databases which would require significant effort to create the initial code-first Data Models. 
### AutoGen vs TypeScript Data Models This AutoGen-based approach uses runtime C# reflection to inspect your RDBMS schema at startup and dynamically register AutoQuery CRUD Services for your existing tables. By contrast, the [`okai` TypeScript Data Models](/autoquery/okai-db) approach first exports DB metadata and converts it into TypeScript Data Models, then generates the AutoQuery CRUD APIs, RDBMS Data Models and DB migrations from those definitions. <lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="NaJ7TW-Q_pU" style="background-image: url('https://img.youtube.com/vi/NaJ7TW-Q_pU/maxresdefault.jpg')"></lite-youtube> ### Code Generation of AutoQuery & CRUD Services Now with AutoCrud we can add a lot more value in this area as AutoCrud's declarative nature allows us to easily generate AutoQuery & Crud Services by just emitting declarative Request DTOs. You can then add the generated DTOs to your ServiceModel project to quickly enable AutoQuery Services for your existing databases. 
<img src="/img/pages/svg/servicify.svg" width="100%"> To enable this feature you just need to initialize `GenerateCrudServices` in your `AutoQueryFeature` plugin, e.g: ```csharp services.AddPlugin(new AutoQueryFeature { MaxLimit = 1000, GenerateCrudServices = new GenerateCrudServices {} }); ``` If you don't have an existing database, you can quickly test this out with a Northwind SQLite database available from [https://github.com/NetCoreApps/NorthwindAuto](https://github.com/NetCoreApps/NorthwindAuto): :::sh x download NetCoreApps/NorthwindAuto ::: As you'll need to use 2 terminal windows, I'd recommend opening the project with **VS Code** which has great multi-terminal support: :::sh code NorthwindAuto ::: ### Register DB Connection The important parts of this project are registering the OrmLite DB Connection, the above configuration and the local **northwind.sqlite** database, i.e: ```csharp container.AddSingleton<IDbConnectionFactory>(c => new OrmLiteConnectionFactory(MapProjectPath("~/northwind.sqlite"), SqliteDialect.Provider)); ``` When using [Endpoint Routing](/endpoint-routing) the DB Factory also needs to be initialized on `GenerateCrudServices`, the easiest way to do that is to register the AutoQueryFeature plugin in `Configure.Db.cs`, e.g: ```csharp public class ConfigureDb : IHostingStartup { public void Configure(IWebHostBuilder builder) => builder .ConfigureServices((context, services) => { var ormLite = services.AddOrmLite(options => options.UseSqlite(connString)); services.AddPlugin(new AutoQueryFeature { MaxLimit = 1000, GenerateCrudServices = new GenerateCrudServices { DbFactory = ormLite.DbFactory, } }); // ... 
}); } ``` ## Export Code-First DTOs After restarting your App with AutoGen's `GenerateCrudServices` enabled you can export the auto-generated APIs and Data Models into code-first C# classes. <lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="mFyMgg7c3vg" style="background-image: url('https://img.youtube.com/vi/mFyMgg7c3vg/maxresdefault.jpg')"></lite-youtube> ### Generating AutoQuery APIs and Data Models The development experience is essentially the same as [Add ServiceStack Reference](/add-servicestack-reference) where you'll need to run the .NET Core App in 1 terminal: :::sh dotnet run ::: Then use the `x` dotnet tool to download all the AutoQuery & Crud Services for all tables in the configured DB connection: :::sh x csharp https://localhost:5001 -path /crud/all/csharp ::: If you're running a mix of auto-generated AutoQuery APIs alongside existing typed AutoQuery APIs, you can generate just the **new** Services with: :::sh x csharp https://localhost:5001 -path /crud/new/csharp ::: Which you can then copy into your ServiceModel project. ::: tip If no schema is provided, a default schema is used which depends on the Dialect Provider, e.g. SQL Server, PostgreSQL, etc. Multiple schema support requires the AutoQueryFeature to be configured with [multiple schemas](#multiple-schemas-and-rdbms-connections). ::: ### Updating Generated Services If your RDBMS schema changes you'd need to restart your .NET App, then you can update all existing `dtos.cs` with: :::sh x csharp ::: i.e. the same experience as updating normal DTOs. ### Switch to code-first dev model After generating `dtos.cs`, AutoGen is no longer needed and can be removed by commenting out `GenerateCrudServices`: ```csharp services.AddPlugin(new AutoQueryFeature { MaxLimit = 1000, // GenerateCrudServices = new GenerateCrudServices {} }); ``` All functionality and features can now be added by annotating & enhancing your code-first C# Typed DTOs. 
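For example, after switching to code-first you can layer ServiceStack's declarative attributes directly on the exported DTOs. A sketch assuming a hypothetical exported `Shipper` CRUD API:

```csharp
// Exported AutoGen CRUD Request DTO, now enhanced by hand with
// declarative auth & validation attributes
[ValidateIsAuthenticated]
public class PatchShipper : IPatchDb<Shipper>, IReturn<IdResponse>
{
    public int Id { get; set; }

    [ValidateNotEmpty]
    public string CompanyName { get; set; }

    [ValidateNotEmpty]
    public string Phone { get; set; }
}
```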
### AutoGen specific options in AutoGen DTOs When using AutoGen to generate Typed AutoQuery DTOs for your RDBMS Tables: :::sh x csharp https://localhost:5001 -path /crud/all/csharp ::: The generated `dtos.cs` includes AutoGen-specific options that can be used to maintain multiple custom RDBMS Tables and AutoQuery APIs, e.g. you could maintain one for each of your different RDBMS schemas: ```csharp
/* Options:
//...
//IncludeCrudOperations:
//Schema: custom
//NamedConnection:
//NoCache:
//IncludeTables:
//ExcludeTables:
//AuthSecret:
*/
``` ### AutoGen Code Generation Customizations AutoGen also supports a number of options to customize its generated code: - **IncludeCrudOperations** `string[]` - Which AutoCrud APIs to include: `Query`, `Create`, `Update`, `Patch`, `Delete` - **Schema** `string` - The RDBMS Schema you want AutoQuery Services generated for - **NamedConnection** `string` - The NamedConnection you want AutoQuery Services generated for - **NoCache** `boolean` - Do not use cached DB Table Schemas, re-fetch latest - **IncludeTables** `string[]` - Allow List to specify only the tables you would like to have code-generated - **ExcludeTables** `string[]` - Block List to exclude tables from code-generation - **AuthSecret** `string` - The Admin AuthSecret to access Service in Release mode For example you can use an empty array to only generate Data Models without any AutoQuery APIs: :::sh x csharp https://localhost:5001 -path /crud/all/csharp -q IncludeCrudOperations=[] ::: Only generate read-only APIs by limiting generation to Query AutoQuery Request DTOs: :::sh x csharp https://localhost:5001 -path /crud/all/csharp -q IncludeCrudOperations=[Query] ::: Only generate Read, Create and Update APIs (i.e. 
no Delete): :::sh x csharp https://localhost:5001 -path /crud/all/csharp -q IncludeCrudOperations=[Query,Create,Patch] ::: These customized generated classes are also available from your App's built-in endpoints: :::{.table .table-striped}
| Type | URL |
| --- | --- |
| All Services (C#) | `https://localhost:5001/crud/all/csharp` |
| Only New Services (C#) | `https://localhost:5001/crud/new/csharp` |
| Read Only Services (C#) | `https://localhost:5001/crud/all/csharp?IncludeCrudOperations=[Query]` |
| Only Data Models (C#) | `https://localhost:5001/crud/all/csharp?IncludeCrudOperations=[]` |
| New Auto Generated Services (C#) | `https://localhost:5001/crud/new/csharp` |
| Generate DTOs in alt languages (e.g. TypeScript) | `https://localhost:5001/crud/all/typescript` |
::: ## AutoRegister AutoGen AutoQuery Services To recap, we've now got an integrated scaffolding solution where we can quickly generate code-first AutoQuery Services and integrate them into our App to build an AutoQuery Service layer around our existing database. But we can raise the productivity level even higher: instead of manually importing the code-generated Services into our project, we can tell ServiceStack to do it for us. This is what the magical `AutoRegister` flag does for us: ```csharp var ormLite = services.AddOrmLite(options => options.UseSqlite(connString)); services.AddPlugin(new AutoQueryFeature { GenerateCrudServices = new GenerateCrudServices { DbFactory = ormLite.DbFactory, AutoRegister = true, //.... 
} }); ``` ### Instantly Servicify Northwind DB with gRPC To show the exciting potential of this feature we'll demonstrate one valuable use-case: creating a [grpc](/grpc/) project, mixing in AutoQuery configuration to instantly Servicify the Northwind DB, browsing the generated Services from ServiceStack's [Metadata Page](/metadata-page), exploring the gRPC Services `.proto`, then creating a new Dart App to consume the gRPC Services: > YouTube: [youtu.be/5NNCaWMviXU](https://youtu.be/5NNCaWMviXU) [![](/img/pages/release-notes/v5.9/autogen-grpc.png)](https://youtu.be/5NNCaWMviXU) #### Step-by-step Guide See the annotated guide below to follow along: Create a new [grpc](https://github.com/NetCoreTemplates/grpc) .NET Core project and open it in VS Code: :::sh
npx create-net grpc NorthwindApi
code NorthwindApi
::: Inside VS Code open a Terminal Window and [mix in](/mix-tool) the required configuration: :::sh
cd NorthwindApi
npx add-in autocrudgen sqlite northwind.sqlite
::: Which will mix in the [autocrudgen](https://gist.github.com/gistlyn/464a80c15cb3af4f41db7810082dc00c) gist to enable AutoQuery and tell it to Auto Generate AutoQuery and CRUD Services for all tables in the registered RDBMS (default schema): ```csharp public class ConfigureDb : IHostingStartup { public void Configure(IWebHostBuilder builder) { builder.ConfigureServices((context,services) => { var ormLite = services.AddOrmLite(options => options.UseSqlite(connString)); services.AddPlugin(new AutoQueryFeature { MaxLimit = 1000, GenerateCrudServices = new GenerateCrudServices { DbFactory = ormLite.DbFactory, AutoRegister = true } }); }); } } ``` The [sqlite](https://gist.github.com/gistlyn/768d7b330b8c977f43310b954ceea668) gist registers an [OrmLite.Sqlite](/ormlite/) RDBMS connection with our App which we want to configure to connect to a **northwind.sqlite** database: ```csharp public void Configure(IServiceCollection services) { services.AddSingleton<IDbConnectionFactory>(new 
OrmLiteConnectionFactory( Configuration.GetConnectionString("DefaultConnection"), SqliteDialect.Provider)); } ``` Then we apply the [northwind.sqlite](https://gist.github.com/gistlyn/97d0bcd3ebd582e06c85f8400683e037) gist to add the **northwind.sqlite** database to our new project. Now that our App's configured we can run it with: :::sh dotnet run ::: Where it will start the ServiceStack gRPC App on the 3 ports configured in **appsettings.json**:
- `5001` - Enables access from existing HTTP/1.1 clients and proxies
- `5051` - Enables a secure gRPC Channel
- `5054` - Enables an insecure gRPC Channel
```json { "Kestrel": { "Endpoints": { "Https": { "Url": "https://*:5001", "Protocols": "Http1" }, "GrpcSecure": { "Url": "https://*:5051", "Protocols": "Http2" }, "GrpcInsecure" : { "Url": "http://*:5054", "Protocols": "Http2" } } } } ``` Once running you can view your App's metadata page at `https://localhost:5001` to inspect all the Services that were generated. #### Create Dart gRPC Console App It's also now accessible via [ServiceStack's gRPC endpoint](/grpc/) which opens your generated Services up to [Google's high-performance gRPC ecosystem](https://grpc.io) which enables typed, high-performance integrations into exciting platforms like [Flutter](https://flutter.dev) which uses the [Dart](https://dart.dev) programming language to create Reactive, high-performance native Android and iOS Apps. 
We can test Dart's gRPC integration and development workflow in a new Dart Console App we can create with: ```bash mkdir dart-grpc && cd dart-grpc pub global activate stagehand stagehand console-full ``` We'll need to update **pubspec.yaml** with the required gRPC dependencies: ```yaml dependencies: fixnum: ^0.10.11 async: ^2.2.0 protobuf: ^1.0.1 grpc: ^2.1.3 ``` When you save **pubspec.yaml** Dart's VS Code extension will automatically fetch any new dependencies, which can also be manually run with: :::sh pub get ::: We can then use the [protoc support in the dotnet tools](/grpc#public-grpc-protoc-service-and-ui) to download our `.proto` Services descriptor and generate Dart's gRPC classes with a single command: :::sh x proto-dart https://localhost:5001 -out lib ::: We're now all set to consume our gRPC Services using the protoc generated gRPC proxy in our `main()` function in **main.dart**: ```dart import 'dart:io'; import 'package:grpc/grpc.dart'; import 'package:dart_grpc/services.pb.dart'; import 'package:dart_grpc/services.pbgrpc.dart'; void main(List<String> arguments) async { var client = GrpcServicesClient(ClientChannel('localhost', port:5054, options:ChannelOptions(credentials: ChannelCredentials.insecure()))); var response = await client.getQueryCategory(QueryCategory()); print(response.results); exit(0); } ``` Which can be run with: :::sh dart bin\main.dart ::: ### Calling gRPC SSL Services The [Dart gRPC Docs](/grpc/dart#dart-protoc-grpc-ssl-example) shows how we can connect to it via our gRPC SSL endpoint by running the openssl scripts in [grpc/scripts](https://github.com/NetCoreTemplates/grpc/tree/master/scripts) to generate our **dev.crt** and **prod.crt** SSL Certificates that you can configure in your **GrpcSecure** endpoint with: ```json { "Kestrel": { "Endpoints": { "GrpcSecure": { "Url": "https://*:5051", "Protocols": "Http2", "Certificate": { "Path": "dev.pfx", "Password": "grpc" } } } } } ``` Where you'll then be able to access the 
secure gRPC SSL endpoints using the generated **dev.crt** certificate in your Dart App: ```dart import 'dart:io'; import 'package:grpc/grpc.dart'; import 'package:dart_grpc/services.pb.dart'; import 'package:dart_grpc/services.pbgrpc.dart'; GrpcServicesClient createClient({CallOptions options}) { return GrpcServicesClient(ClientChannel('localhost', port:5051, options:ChannelOptions(credentials: ChannelCredentials.secure( certificates: File('dev.crt').readAsBytesSync(), authority: 'localhost'))), options:options); } void main(List<String> args) async { var client = createClient(); var response = await client.getQueryCategory(QueryCategory()); print(response.results); exit(0); } ``` ### AutoGen's AutoRegister Implementation Whilst the `AutoRegister = true` flag on its face may seem magical, it's simply an instruction that tells ServiceStack to take the **new** AutoQuery Services it already knows about and register them as if they were normal code-first Services that we had written ourselves. More accurately, behind the scenes it takes the Metadata Type structure it constructed when generating the Services & Types (the same Types that are projected into Add ServiceStack Reference's generated C#, TypeScript and other languages, and that are manipulated when customizing code-generation) and uses it to generate .NET Types in memory on Startup with Reflection.Emit. Barring any issues with the projection into IL, externally the end result is indistinguishable from a normal code-first ServiceStack Service manually created by a developer. This is an important reason why these solutions compose well with the rest of ServiceStack: just as an AutoQuery Service is a normal ServiceStack Service, these auto-generated & auto-registered ServiceStack Services are regular AutoQuery Services. 
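Because auto-registered Services behave like code-first ones, they can also be consumed with ServiceStack's typed C# Service Client using DTOs from Add ServiceStack Reference. A sketch assuming the Northwind `QueryCategory` API used in the Dart example above:

```csharp
var client = new JsonApiClient("https://localhost:5001");

// Call the auto-generated AutoQuery API with its typed Request DTO
var response = await client.GetAsync(new QueryCategory());

// Inspect the typed QueryResponse results
response.Results.PrintDump();
```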
The primary difference is that they only exist in a .NET Assembly in memory created on Startup, not in code, so they're not "statically visible" to a C# compiler, IDE, tools, etc. But otherwise they're regular typed ServiceStack Services and can take advantage of the ecosystem around Services including [Add ServiceStack Reference](/add-servicestack-reference) & other Metadata Pages and Services, etc. ### CreateCrudServices Instructions Peeking deeper behind the `AutoRegister` flag will reveal that it's a helper for adding an empty `CreateCrudServices` instance, i.e. it's equivalent to: ```csharp var ormLite = services.AddOrmLite(options => options.UseSqlite(connString)); services.AddPlugin(new AutoQueryFeature { GenerateCrudServices = new GenerateCrudServices { DbFactory = ormLite.DbFactory, CreateServices = { new CreateCrudServices() } //.... } }); ``` #### Multiple Schemas and RDBMS Connections This instructs ServiceStack to generate Services for the default option, i.e. all tables in the Database of the default registered Database connection and default schema. Although should you wish to, you can also generate Services for multiple Databases and RDBMS Schemas within the same App. With this you could have a single API Gateway Servicifying access to multiple System RDBMS Tables & Schemas, e.g: ```csharp var ormLite = services.AddOrmLite(options => options.UseSqlite(connString)); services.AddPlugin(new AutoQueryFeature { GenerateCrudServices = new GenerateCrudServices { DbFactory = ormLite.DbFactory, CreateServices = { new CreateCrudServices(), new CreateCrudServices { Schema = "AltSchema" }, new CreateCrudServices { NamedConnection = "Reporting" }, new CreateCrudServices { NamedConnection = "Reporting", Schema = "AltSchema" }, } //.... 
} }); ``` These will generate Service Contracts & DTO Types with the Multitenancy [NamedConnection](/autoquery/rdbms#named-connection) & OrmLite `[Schema]` attribute required for routing AutoQuery Services to use the appropriate RDBMS connection or Schema. There are potential conflicts if identical table names exist in each RDBMS/Schema, as it has to go back and rewrite the Metadata References to use a non-ambiguous name: it first tries using the NamedConnection, then the Schema, then a combination when both exist; if it's still ambiguous it gives up and ignores the table. If you do run into conflicts, the recommendation is to "eject" the generated `.cs` sources and manually update them to use your preferred unique names. ### Customize Code Generation to include App Conventions Being able to instantly generate AutoQuery Services for all your RDBMS tables is nice, but it's even nicer if you can easily customize the code-generation! Together with the flexibility of the new declarative validation support you can compose a surprisingly large amount of your App's logic using the versatility of C# to automate embedding your App's conventions by annotating them on declarative Request DTOs. 
The existing code-generation already infers a lot from your RDBMS schema which you can further augment using the available `GenerateCrudServices` filters: - `ServiceFilter` - called with every Service Operation - `TypeFilter` - called with every DTO Type - `IncludeService` - a predicate to return whether the **Service** should be included - `IncludeType` - a predicate to return whether the **Type** should be included - `TableSchemasFilter` - an action to modify the `List<TableSchema>` that AutoGen uses to generate data models For an illustration of this in action, here's a typical scenario of how the Northwind AutoQuery Services could be customized: - Controlling which Tables **not to generate Services for** in `ignoreTables` - Which tables not to generate **Write Crud Services** for in `readOnlyTables` - Which tables to **restrict access** to in different roles in `protectTableByRole` - Example of **additional validation** to existing tables in `tableRequiredFields` - Adds the `[ValidateNotEmpty]` attribute to Services accessing the table and the `[Required]` OrmLite attribute for the Data Model DTO Type. 
```csharp var ignoreTables = new[] { "IgnoredTable", }; // don't generate AutoCrud APIs for these tables var readOnlyTables = new[] { "Region" }; var protectTableByRole = new Dictionary<string,string[]> { ["Admin"] = new[] { nameof(CrudEvent), nameof(ValidationRule) }, ["Accounts"] = new[] { "Order", "Supplier", "Shipper" }, ["Employee"] = new[] { "Customer", "Order", "OrderDetail" }, ["Manager"] = new[] { "Product", "Category", "Employee", "UserAuth", "UserAuthDetails" }, }; var tableRequiredFields = new Dictionary<string,string[]> { ["Shipper"] = new[]{ "CompanyName", "Phone" }, }; services.AddPlugin(new AutoQueryFeature { MaxLimit = 100, GenerateCrudServices = new GenerateCrudServices { DbFactory = dbFactory, ServiceFilter = (op,req) => { // Require all Write Access to Tables to be limited to Authenticated Users if (op.IsCrudWrite()) { op.Request.AddAttributeIfNotExists(new ValidateRequestAttribute("IsAuthenticated"), x => x.Validator == "IsAuthenticated"); } // Limit Access to specific Tables foreach (var tableRole in protectTableByRole) { foreach (var table in tableRole.Value) { if (op.ReferencesAny(table)) op.Request.AddAttribute(new ValidateHasRoleAttribute(tableRole.Key)); } } // Add [ValidateNotEmpty] attribute on Services operating Tables with Required Fields if (op.DataModel != null && tableRequiredFields.TryGetValue(op.DataModel.Name, out var required)) { var props = op.Request.Properties.Where(x => required.Contains(x.Name)); props.Each(x => x.AddAttribute(new ValidateNotEmptyAttribute())); } }, TypeFilter = (type, req) => { // Add OrmLite [Required] Attribute on Tables with Required Fields if (tableRequiredFields.TryGetValue(type.Name, out var requiredFields)) { var props = type.Properties.Where(x => requiredFields.Contains(x.Name)); props.Each(x => x.AddAttribute(new RequiredAttribute())); } }, //Don't generate the Services or Types for Ignored Tables IncludeService = op => !ignoreTables.Any(table => op.ReferencesAny(table)) && !(op.IsCrudWrite() && 
readOnlyTables.Any(table => op.ReferencesAny(table))),
        IncludeType = type => !ignoreTables.Contains(type.Name),
    }
});
```

Additionally, the `TableSchemasFilter` can be used to modify the schemas AutoGen uses to generate the types associated with your AutoQuery APIs. This gives you the opportunity to filter or modify the schemas after they're pulled from the database. For example, we could `Remove` tables based on naming conventions, or alter column definitions to work around any schema issues:

```csharp
var ormLite = services.AddOrmLite(options => options.UseSqlite(connString));

GenerateCrudServices = new GenerateCrudServices {
    DbFactory = ormLite.DbFactory,
    AutoRegister = true,
    AddDataContractAttributes = false,
    TableSchemasFilter = tableSchemas => {
        // Don't include tables starting with an underscore
        tableSchemas.RemoveAll(tableSchema => tableSchema.Name.StartsWith("_"));
        // Don't include columns of "geometry" type
        tableSchemas.ForEach(tableSchema => tableSchema.Columns = tableSchema.Columns
            .Where(x => x.DataTypeName != "geometry").ToArray());
    }
}
```

To assist in code generation a number of high-level APIs are available to help with identifying Services, e.g:

- `operation.IsCrud()` - Is a read-only AutoQuery or AutoCrud write Service
- `operation.IsCrudWrite()` - Is an AutoCrud write Service
- `operation.IsCrudRead()` - Is a read-only AutoQuery Service
- `operation.ReferencesAny()` - The DTO Type is referenced anywhere in the Service (e.g.
Request/Response DTOs, Inheritance, Generic Args, etc)
- `type.InheritsAny()` - The DTO inherits any of the specified type names
- `type.ImplementsAny()` - The DTO implements any of the specified interface type names

### Mixing generated AutoQuery Services & existing code-first Services

The expected use-case for these new features is that you'd create a new project that points at an existing database to bootstrap your project with code-first AutoQuery Services, using the dotnet tool to download the generated types, i.e:

:::sh
`x csharp https://localhost:5001 -path /crud/all/csharp`
:::

At which point you'd "eject" from the generated AutoQuery Services (forgetting about this feature), copy the generated types into your **ServiceModel** project and continue development with code-first Services just as if you'd created the Services manually.

But the `GenerateCrudServices` feature also supports a "hybrid" mode where you can generate Services for just the **new** tables that don't yet have any existing Services, whose generated Services you can access from:

:::sh
`x csharp https://localhost:5001 -path /crud/new/csharp`
:::

The existing `/crud/all/csharp` Service continues to return generated Services for all Tables but will stitch together and use existing types where they exist.

> If your new code-first Services are missing, remember to make sure your AppHost is scanning the assembly they belong to by using `typeof(MyGeneratedType).Assembly` in your `AppHost` base constructor.

### Trying it out

We now have all the features we need to quickly servicify an existing database, which we can easily customize to apply custom App logic to further protect & validate access.
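The note above about assembly scanning can be sketched as follows, where `MyGeneratedType` is any of the ejected code-first types — the App name & `AppHostBase` base class are assumptions based on a typical ASP.NET Core host:

```csharp
// A minimal sketch: ensure the assembly containing your ejected code-first
// Services & DTOs is registered so ServiceStack scans it at startup
public class AppHost : AppHostBase
{
    public AppHost() : base("NorthwindCrud", typeof(MyGeneratedType).Assembly) {}

    public override void Configure(Container container)
    {
        // Existing configuration, e.g. AutoQueryFeature with GenerateCrudServices
    }
}
```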
So you can quickly explore these new features locally, you can download the enhanced Northwind example with the customizations above in the new [github.com/NetCoreApps/NorthwindCrud](https://github.com/NetCoreApps/NorthwindCrud) project, which you can download & run with:

```bash
x download NetCoreApps/NorthwindCrud
cd NorthwindCrud
dotnet run
```

This example App is also configured with other new features in the incoming release, including Crud Events in [Startup.cs](https://github.com/NetCoreApps/NorthwindCrud/blob/master/Startup.cs):

```csharp
// Add support for auto capturing executable audit history for AutoCrud Services
container.AddSingleton<ICrudEvents>(c =>
    new OrmLiteCrudEvents(c.Resolve<IDbConnectionFactory>()));
container.Resolve<ICrudEvents>().InitSchema();
```

As well as support for dynamically generated db rules in [Configure.Validation.cs](https://github.com/NetCoreApps/NorthwindCrud/blob/master/Configure.Validation.cs):

```csharp
services.AddSingleton<IValidationSource>(c =>
    new OrmLiteValidationSource(c.Resolve<IDbConnectionFactory>()));
appHost.Resolve<IValidationSource>().InitSchema();
```

To be able to test the custom code generation, the example is pre-populated with 3 Users with different roles in [Configure.Auth.cs](https://github.com/NetCoreApps/NorthwindCrud/blob/master/Configure.Auth.cs):

```csharp
// Register Users that don't exist
void EnsureUser(string email, string name, string[] roles=null)
{
    if (authRepo.GetUserAuthByUserName(email) != null) return;

    authRepo.CreateUserAuth(new UserAuth {
        Email = email,
        DisplayName = name,
        Roles = roles?.ToList(),
    }, password:"p@ss");
}

EnsureUser("employee@gmail.com", name:"A Employee",   roles:new[]{ "Employee" });
EnsureUser("accounts@gmail.com", name:"Account Dept", roles:new[]{ "Employee", "Accounts" });
EnsureUser("manager@gmail.com",  name:"The Manager",  roles:new[]{ "Employee", "Manager" });
```

These Users are also published on [NorthwindCrud's home
page](https://github.com/NetCoreApps/NorthwindCrud).

### AutoGen Customizations

AutoGen's [Instantly Servicify existing Systems](/servicify) feature works by automatically generating the AutoQuery & Crud APIs and Data Models for all tables in the configured RDBMS's. The DataModel Names, the user-defined Route Path they're hosted at & the names of the individual AutoQuery APIs for each operation can all be further customized using the `GenerateOperationsFilter`.

So if you had an existing table named `applications`, the default convention-based names would be:

- Data Model: `Applications`
- APIs: `CreateApplications`, `PatchApplications`, `QueryApplications`, etc
- Route: `/applications`, `/applications/{Id}`

You can change each of these default conventions with the new `GenerateOperationsFilter`, e.g:

```csharp
var ormLite = services.AddOrmLite(options => options.UseSqlite(connString));

services.AddPlugin(new AutoQueryFeature {
    MaxLimit = 1000,
    GenerateCrudServices = new GenerateCrudServices {
        DbFactory = ormLite.DbFactory,
        AutoRegister = true,
        GenerateOperationsFilter = ctx => {
            if (ctx.TableName == "applications")
            {
                ctx.DataModelName = "Application";
                ctx.PluralDataModelName = "Apps";
                ctx.RoutePathBase = "/apps";
                ctx.OperationNames = new Dictionary<string, string> {
                    [AutoCrudOperation.Create] = "CreateApp",
                    [AutoCrudOperation.Patch] = "ModifyApp",
                };
            }
        }
    }
});
```

This would result in:

- Data Model: `Application`
- APIs: `QueryApps`, `CreateApp`, `ModifyApp`
- Route: `/apps`, `/apps/{Id}`

### Retrying Dart gRPC Example

We can see an immediate effect of these customizations in **NorthwindCrud** where most APIs now require Authentication:

![](/img/pages/release-notes/v5.9/northwindcrud-metadata.png)

If we then try to run our Dart `main.dart` example against the customized **NorthwindCrud** APIs, by first regenerating the gRPC protoc Types:

:::sh
`x proto-dart https://localhost:5001 -out lib`
:::

Then rerunning `main.dart`, it will now fail with an **Unauthorized**
exception:

![](/img/pages/release-notes/v5.9/northwindcrud-noauth.png)

To be able to access most Services we'll now need to [Authenticate as a registered user](/grpc/dart#dart-grpc-authenticated-request-example). As NorthwindCrud is [configured to use JWT](https://github.com/NetCoreApps/NorthwindCrud/blob/master/Configure.Auth.cs) we can create an Authenticated gRPC client by adding the populated JWT Token from an Authenticated Request into the **Authorization** gRPC metadata Header:

```dart
GrpcServicesClient createClient({CallOptions options}) {
  return GrpcServicesClient(
      ClientChannel('localhost', port:5054,
          options:ChannelOptions(credentials: ChannelCredentials.insecure())),
      options:options);
}

void main(List<String> arguments) async {
  var authResponse = await createClient().postAuthenticate(Authenticate()
    ..provider='credentials'
    ..userName='manager@gmail.com'
    ..password='p@ss');

  var authClient = createClient(options:CallOptions(metadata:{
    'Authorization': 'Bearer ${authResponse.bearerToken}'
  }));

  var response = await authClient.getQueryCategory(QueryCategory());
  print(response.results);
  exit(0);
}
```

Now when we rerun `main.dart` we'll be able to access our Northwind categories again:

![](/img/pages/release-notes/v5.9/northwindcrud-jwtauth.png)

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/redis/automatic-retries.md

---
slug: automatic-retries
title: Automatic Retries
---

One feature that improves the resilience of `RedisClient` connections is Auto Retry, where the RedisClient will transparently retry Redis operations that fail due to Socket and I/O Exceptions, using an exponential backoff starting from **10ms** up until the `RetryTimeout` of **10000ms**. These defaults can be tweaked with:

## Usage

```csharp
RedisConfig.DefaultRetryTimeout = 10000;
RedisConfig.BackOffMultiplier = 10;
```

The `RetryTimeout` can also be configured on the connection string with `?RetryTimeout=10000`.
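As a sketch combining both configuration styles — the `RedisManagerPool` usage below is an assumption about how your client pool is created; the `RedisConfig` statics and the `?RetryTimeout` connection string parameter are as documented above:

```csharp
// Global retry defaults for all RedisClients (values shown are the defaults)
RedisConfig.DefaultRetryTimeout = 10000; // total retry window in ms
RedisConfig.BackOffMultiplier = 10;      // initial back-off in ms

// Or override the RetryTimeout for a specific connection (assumed pool setup)
var redisManager = new RedisManagerPool("localhost:6379?RetryTimeout=10000");
using (var redis = redisManager.GetClient())
{
    redis.Set("key", "value"); // transparently retried on Socket/I/O Exceptions
}
```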
--- # Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/vue/autoquerygrid.md --- title: AutoQueryGrid Component group: Component Gallery --- <div class="flex justify-center"> <lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="znCoC-Ct0Ps" style="background-image: url('https://img.youtube.com/vi/znCoC-Ct0Ps/maxresdefault.jpg')"></lite-youtube> </div> <api-reference component="AutoQueryGrid<Model>"></api-reference> ## Default CRUD By default you can create an AutoQueryGrid that allows authorized users the ability to Create, Read, Update & Delete records with just the DataModel, e.g: ```html <AutoQueryGrid type="Booking" /> ``` <div class="not-prose prose-table"> <auto-query-grid type="Booking"></auto-query-grid> </div> This will utilize your App's existing [AutoQuery APIs](/autoquery/rdbms) for the specified DataModel to enable its CRUD functionality. <api-reference component="AutoQueryGrid<Model>"></api-reference> ## Read Only You can use `apis` to limit which AutoQuery APIs AutoQueryGrid should use, so if only the AutoQuery DTO is provided, the AutoQueryGrid will only be browsable in **read-only** mode: ```html <AutoQueryGrid type="Booking" apis="QueryBookings" /> ``` <div class="not-prose prose-table"> <auto-query-grid type="Booking" apis="QueryBookings"></auto-query-grid> </div> <h2 class="pt-8 mb-4 text-2xl font-semibold text-gray-900 dark:text-gray-100"> Table Styles </h2> The same [DataGrid Table Styles](/vue/datagrid#table-styles) can also be used to style AutoQueryGrid, e.g: ```html <AutoQueryGrid type="Booking" tableStyle="verticalLines,uppercaseHeadings" /> ``` <div class="not-prose prose-table"> <auto-query-grid type="Booking" tableStyle="verticalLines,uppercaseHeadings"></auto-query-grid> </div> **Custom Styles** The AutoQueryGrid's appearance is further customizable with the property classes & functions below: ```ts defineProps<{ toolbarButtonClass: string tableStyle: "simple" 
| "fullWidth" | "stripedRows" | "whiteBackground" | "uppercaseHeadings" | "verticalLines" gridClass: string grid2Class: string grid3Class: string grid4Class: string tableClass: string theadClass: string tbodyClass: string theadRowClass: string theadCellClass: string rowClass:(model:any,i:number) => string rowStyle:(model:any,i:number) => StyleValue }>() ``` <h2 class="pt-8 mb-4 text-2xl font-semibold text-gray-900 dark:text-gray-100"> Custom AutoQueryGrid </h2> Different AutoQueryGrid features can be hidden with `hide` and functionality disabled with `deny`, e.g: ```html <AutoQueryGrid type="Booking" hide="pagingNav,copyApiUrl,downloadCsv" deny="filtering" /> ``` <div class="not-prose prose-table"> <auto-query-grid type="Booking" hide="pagingNav,copyApiUrl,downloadCsv" deny="filtering"></auto-query-grid> </div> Features that can be hidden and disabled include: ```ts defineProps<{ deny: "filtering" | "queryString" | "queryFilters" hide: "toolbar" | "preferences" | "pagingNav" | "pagingInfo" | "downloadCsv" | "refresh" | "copyApiUrl" | "filtersView" | "newItem" | "resetPreferences" }>() ``` <h2 class="pt-8 mb-4 text-2xl font-semibold text-gray-900 dark:text-gray-100"> Global AutoQueryGrid Configuration </h2> These features can also be disabled at a global level, applying to all `<AutoQueryGrid>` components with [setConfig](/vue/use-config), e.g: ```js const { setAutoQueryGridDefaults } = useConfig() setAutoQueryGridDefaults({ hide: ['pagingNav','copyApiUrl','downloadCsv'] }) ``` <h2 class="pt-8 mb-4 text-2xl font-semibold text-gray-900 dark:text-gray-100"> Limit Columns </h2> By default AutoQueryGrid displays all public properties returned in its AutoQuery API which can be further limited with `selected-columns`: ```html <AutoQueryGrid type="Booking" selectedColumns="id,name,roomType,roomNumber,cost" /> ``` <div class="not-prose prose-table"> <auto-query-grid type="Booking" selectedColumns="id,name,roomType,roomNumber,cost"></auto-query-grid> </div> <h2 class="pt-8 
mb-4 text-2xl font-semibold text-gray-900 dark:text-gray-100"> Simple Responsive Columns </h2> Using `visible-from` is a simple way to enable a responsive DataGrid by specifying at which [Tailwind breakpoints](https://tailwindcss.com/docs/responsive-design) columns should be visible from and `header-titles` to use friendlier aliases for different columns, e.g: ```html <AutoQueryGrid type="Booking" selectedColumns="id,name,roomType,roomNumber,cost,bookingStartDate,bookingEndDate" :headerTitles="{ roomNumber:'Room', bookingStartDate:'Start', bookingEndDate:'End' }" :visibleFrom="{ bookingStartDate:'lg', bookingEndDate:'xl' }" /> ``` <div class="not-prose prose-table"> <auto-query-grid type="Booking" selectedColumns="id,name,roomType,roomNumber,cost,bookingStartDate,bookingEndDate" :headerTitles="{ roomNumber:'Room', bookingStartDate:'Start', bookingEndDate:'End' }" :visibleFrom="{ bookingStartDate:'lg', bookingEndDate:'xl' }"></auto-query-grid> </div> <h2 class="pt-8 mb-4 text-2xl font-semibold text-gray-900 dark:text-gray-100"> Custom Responsive Columns </h2> Which columns are displayed and how they're formatted are further customizable with `<template #column>` slots: ```html <AutoQueryGrid type="Booking" :visibleFrom="{ name:'xl', bookingStartDate:'sm', bookingEndDate:'xl', createdBy:'2xl' }"> <template #id="{ id }"> <span class="text-gray-900" v-html="id"></span> </template> <template #name="{ name }">{{name}}</template> <template #roomNumber-header> <span class="hidden lg:inline">Room </span>No </template> <template #cost="{ cost }"> <span v-html="currency(cost)"></span> </template> <template #bookingStartDate-header> Start<span class="hidden lg:inline"> Date</span> </template> <template #bookingEndDate-header> End<span class="hidden lg:inline"> Date</span> </template> <template #createdBy-header> Employee </template> <template #createdBy="{ createdBy }" v-html="createdBy"></template> </AutoQueryGrid> ``` <responsive class="not-prose mb-4"></responsive> <h2 
class="pt-8 mb-4 text-2xl font-semibold text-gray-900 dark:text-gray-100"> Custom Functionality </h2>

The column template slots can be leveraged to implement custom functionality, for example instead of navigating to separate pages to manage related data we can use a custom column to manage Booking Coupons from within the same grid, e.g:

```html
<AutoQueryGrid type="Booking" selectedColumns="id,name,cost,bookingStartDate,bookingEndDate,discount">
  <template #discount="{ discount }">
    <TextLink v-if="discount" class="flex items-end" @click.stop="showCoupon(discount.id)" :title="discount.id">
      <Icon class="w-5 h-5 mr-1" type="Coupon" />
      <PreviewFormat :value="discount.description" />
    </TextLink>
  </template>
</AutoQueryGrid>
<AutoEditForm v-if="coupon" type="UpdateCoupon" v-model="coupon" @done="close" @save="close" />
<script setup lang="ts">
import { ref } from "vue"
import { useClient } from "@servicestack/vue"
import { QueryCoupons } from "dtos"

const client = useClient()
const coupon = ref()

async function showCoupon(id:string) {
  const api = await client.api(new QueryCoupons({ id }))
  if (api.succeeded) {
    coupon.value = api.response!.results[0]
  }
}
const close = () => coupon.value = null
</script>
```

<custom-booking class="not-prose"></custom-booking>

<h2 class="pt-8 mb-4 text-2xl font-semibold text-gray-900 dark:text-gray-100"> Data Reference Labels </h2>

[AutoQuery](/autoquery/rdbms) is able to infer relationships from the [POCO References](/ormlite/reference-support) of your Data Models: if your DataModel includes `[Reference]` attributes so that its related data is returned in your AutoQuery APIs, AutoQueryGrid can make use of them to render the Contact & Job Names and Icons instead of just the plain Foreign Key Ids.
An example of this in the [JobApplications](https://blazor-gallery.servicestack.net/locode/QueryJobApplications) DataModel DTO:

```csharp
[Icon(Svg = Icons.Application)]
public class JobApplication : AuditBase
{
    [AutoIncrement]
    public int Id { get; set; }

    [References(typeof(Job))]
    public int JobId { get; set; }

    [References(typeof(Contact))]
    public int ContactId { get; set; }

    [Reference]
    [Format(FormatMethods.Hidden)]
    public Job Position { get; set; }

    [Reference]
    [Format(FormatMethods.Hidden)]
    public Contact Applicant { get; set; }

    [Reference]
    public List<JobApplicationComment> Comments { get; set; }

    public DateTime AppliedDate { get; set; }

    public JobApplicationStatus ApplicationStatus { get; set; }
    //...
}
```

Which AutoQueryGrid uses to automatically display the Job and Contact names instead of their Ids:

```html
<AutoQueryGrid type="JobApplication" :prefs="{take:5}" />
```

<div class="not-prose prose-table"> <auto-query-grid type="JobApplication" :prefs="{take:5}"></auto-query-grid> </div>

The original Ids remain discoverable by hovering over the Job & Contact labels.

## Reference Fields

By default AutoQuery will infer the label from the first string column of the related table; this can also be explicitly defined with the `[Ref]` attribute, e.g:

```csharp
public class JobApplication : AuditBase
{
    [AutoIncrement]
    public int Id { get; set; }

    [References(typeof(Job))]
    [Ref(Model=nameof(Job), RefId=nameof(Job.Id), RefLabel=nameof(Job.Title))]
    public int JobId { get; set; }

    [References(typeof(Contact))]
    [Ref(Model=nameof(Contact), RefId=nameof(Contact.Id), RefLabel=nameof(Contact.DisplayName))]
    public int ContactId { get; set; }
    //...
}
```

Alternatively you can use `[Ref(None=true)]` to disable any implicit inferences and render the FK property Ids as-is. When displaying referential data, you can tell AutoQueryGrid to hide rendering complex data reference columns using `[Format(FormatMethods.Hidden)]`.
## AutoQueryGrid Template Slots

AutoQueryGrid supports a number of [Vue slots](https://vuejs.org/guide/components/slots.html) to customize its built-in UIs, including `formheader` and `formfooter` slots to insert custom content before and after the Auto Create & Edit component forms:

```html
<template #formheader="{ form, type, apis, model, id }">
<template #formfooter="{ form, type, apis, model, id }">
```

This feature is used to implement [Locode's Audit History UI](/locode/auditing) for displaying the Audit History of each record at the bottom of the Edit Form for Authorized Users, implemented with:

```html
<AutoQueryGrid :key="store.opDataModel" ref="grid" :type="store.opDataModel">
  <template #formfooter="{ form, type, apis, model, id }">
    <AuditEvents v-if="form === 'edit' && canAccessCrudEvents" class="mt-4" :key="id" :type="type" :id="id" />
  </template>
</AutoQueryGrid>
```

Which loads the [AuditEvents.mjs](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack/src/ServiceStack/modules/locode/components/AuditEvents.mjs) component at the bottom of **Edit** forms, allowing Admin Users to inspect the Audit History of each record:

[![](/img/pages/vue/audit-history-job.png)](/locode/auditing)

Alternatively you can replace the entire Create and Edit Forms used with the `createform` and `editform` slots:

```html
<template #createform="{ form, apis, type }">
<template #editform="{ form, apis, type }">
```

Additional toolbar buttons can be added with the `toolbarbuttons` slot, e.g:

```html
<template #toolbarbuttons="{ toolbarButtonClass }">
  <div class="pl-2 mt-1">
    <button type="button" @click="customAction" :class="toolbarButtonClass">
      <span class="whitespace-nowrap">My Action</span>
    </button>
  </div>
</template>
```

Alternatively you can replace the entire toolbar with your own using:

```html
<template #toolbar>
```

All other template slots are passed down to the embedded [DataGrid](/vue/datagrid) component where they can be used to customize column
headers and cells.

## AutoQueryGrid Properties

Additional customizations available using AutoQueryGrid properties include:

```ts
defineProps<{
  filterDefinitions?: AutoQueryConvention[]
  id?: string
  apis?: string|string[]
  type?: string|InstanceType<any>|Function
  prefs?: ApiPrefs
  deny?: string|GridAllowOptions|GridAllowOptions[]
  hide?: string|GridShowOptions|GridShowOptions[]
  selectedColumns?:string[]|string
  toolbarButtonClass?: string
  tableStyle?: TableStyleOptions
  gridClass?: string
  grid2Class?: string
  grid3Class?: string
  grid4Class?: string
  tableClass?: string
  theadClass?: string
  tbodyClass?: string
  theadRowClass?: string
  theadCellClass?: string
  headerTitle?:(name:string) => string
  headerTitles?: {[name:string]:string}
  visibleFrom?: {[name:string]:Breakpoint}
  rowClass?:(model:any,i:number) => string
  rowStyle?:(model:any,i:number) => StyleValue | undefined
  apiPrefs?: ApiPrefs
  canFilter?:(column:string) => boolean
  disableKeyBindings?:(column:string) => boolean
  configureField?: (field:InputProp) => void
  skip?: number
  create?: boolean
  edit?: string|number
}>()

type ApiPrefs = {
  take?: number
  selectedColumns?: string[]
}
type FormStyle = "slideOver" | "card"
type TableStyle = "simple" | "fullWidth" | "stripedRows" | "whiteBackground" | "uppercaseHeadings" | "verticalLines"
type Breakpoint = "xs" | "sm" | "md" | "lg" | "xl" | "2xl"
type GridAllowOptions = "filtering" | "queryString" | "queryFilters"
type GridShowOptions = "toolbar" | "preferences" | "pagingNav" | "pagingInfo" | "downloadCsv" | "refresh" | "copyApiUrl" | "resetPreferences" | "filtersView" | "newItem"
```

## AutoQueryGrid Events

The `headerSelected` and `rowSelected` events can be used to invoke custom functionality when column headers and rows are selected:

```ts
defineEmits<{
  (e: "headerSelected", name:string, ev:Event): void
  (e: "rowSelected", item:any, ev:Event): void
}>()
```

## Powers Locode

AutoQueryGrid is already used extensively and is the key component that enables [Locode's]
Instant Auto UI to manage your App's AutoQuery CRUD APIs.

[![](/img/pages/vue/blazor-gallery-contact.png)](/locode/)

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/aws-getting-started.md

---
title: Getting started with AWS + ServiceStack
---

![](https://raw.githubusercontent.com/ServiceStack/Assets/5744efd80015870e6441cef6e8fd8bbc79044945/img/aws/servicestack-aws-banner.png)

The [ServiceStackApps/AwsGettingStarted](https://github.com/ServiceStackApps/AwsGettingStarted) repository contains Visual Studio solutions for getting started with AWS and ServiceStack libraries, as well as the step-by-step guides below to get you started.

## Amazon RDS

- [PostgreSQL](#getting-started-with-aws-rds-postgresql-and-ormlite)
- [Aurora](#getting-started-with-aws-rds-aurora-and-ormlite)
- [MySQL](#getting-started-with-aws-rds-mysql-and-ormlite)
- [MariaDB](#getting-started-with-aws-rds-mariadb-and-ormlite)
- [SQL Server](#getting-started-with-aws-rds-sql-server-and-ormlite)

## Amazon ElastiCache

- [Redis](#getting-started-with-aws-elasticache-redis-and-servicestack)
- [Memcached](#getting-started-with-aws-elasticache-and-servicestack)

## Getting Started with AWS RDS PostgreSQL and OrmLite

![](https://github.com/ServiceStack/Assets/raw/master/img/aws/rds-postgres-powered-by-aws.png)

The ServiceStack.OrmLite library has support for [PostgreSQL](http://www.postgresql.org/) databases via the [`ServiceStack.OrmLite.PostgreSQL`](https://www.nuget.org/packages/ServiceStack.OrmLite.PostgreSQL/) NuGet package, which can be used in conjunction with Amazon's RDS service using PostgreSQL. To get started, first you will need to create your PostgreSQL database via the AWS RDS service.

## Creating a PostgreSQL RDS Instance

1. Login to the [AWS Web console](https://console.aws.amazon.com/console/home).
2. Select [RDS](https://console.aws.amazon.com/rds/home) from the **Services** menu at the top.
![](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/aws/aws-rds-menu.png)

3. Select **Instances** from the **RDS Dashboard** and click **Launch DB Instance**.

![](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/aws/launch-db-dashboard.png)

The above steps will start the RDS Wizard to launch a new DB instance. To set up a new PostgreSQL instance, follow the wizard selecting the appropriate options for your application. As an example, we can create a `Customers` database for a non-production environment.

- **Select Engine**
  - Select PostgreSQL
- **Production?**
  - Select `No` for multi-instance/production setup
- **Specify DB Details**
  - Create a `db.t2.micro` instance with default settings
  - Specify **Multi-AZ Deployment** as `No`

![](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/aws/postgres-default-details.png)

  - Specify **DB Instance Identifier**, eg `servicestack-example-customers`.
  - Specify **Master Username**, eg `postgres`.
  - Create and confirm master user password.
- **Configure Advanced Settings**
  - Leave the suggested settings and specify a database name, eg `customers`. This will be used in your connection string.

::: info
Problems can occur if your default VPC does not have DNS Resolution and/or DNS Hostnames enabled. Navigate to **Services**, **VPC** and enable these two options on your default VPC
:::

Click **Launch DB Instance** at the *bottom right* to launch your new instance. If all is successful, you should see the following.

![](https://github.com/ServiceStack/Assets/raw/master/img/aws/create-db-success.png)

## Connecting with ServiceStack.OrmLite

Now that your PostgreSQL instance is running, connecting with OrmLite will require the `ServiceStack.OrmLite.PostgreSQL` NuGet package as well as the connection string to your new PostgreSQL instance.
``` xml
<appSettings>
  <add key="ConnectionString" value="User ID={User};Password={Password};Host={Host};Port={Port};" />
</appSettings>
```

:::copy
`<PackageReference Include="ServiceStack.OrmLite.PostgreSQL" Version="10.*" />`
:::

Once this dependency is installed, an `OrmLiteConnectionFactory` using the `PostgreSqlDialect.Provider` can be configured in the AppHost `Configure` method. For example:

``` csharp
public class AppHost : AppSelfHostBase
{
    public AppHost() : base("AWS PostgreSQL Customers", typeof(AppHost).Assembly) {}

    public override void Configure(Container container)
    {
        container.Register<IDbConnectionFactory>(c => new OrmLiteConnectionFactory(
            AppSettings.GetString("ConnectionString"), PostgreSqlDialect.Provider));

        using (var db = container.Resolve<IDbConnectionFactory>().Open())
        {
            if (db.CreateTableIfNotExists<Customer>())
            {
                //Add seed data
            }
        }
    }
}
```

Using our connection from a ServiceStack Service, we can use the `Db` property to access our `Customer` table. E.g. below is an example of a CRUD service using OrmLite.

``` csharp
public class CustomerService : Service
{
    public object Get(GetCustomers request)
    {
        return new GetCustomersResponse { Results = Db.Select<Customer>() };
    }

    public object Get(GetCustomer request)
    {
        return Db.SingleById<Customer>(request.Id);
    }

    public object Post(CreateCustomer request)
    {
        var customer = new Customer { Name = request.Name };
        Db.Save(customer);
        return customer;
    }

    public object Put(UpdateCustomer request)
    {
        var customer = Db.SingleById<Customer>(request.Id);
        if (customer == null)
            throw HttpError.NotFound("Customer '{0}' does not exist".Fmt(request.Id));

        customer.Name = request.Name;
        Db.Update(customer);
        return customer;
    }

    public void Delete(DeleteCustomer request)
    {
        Db.DeleteById<Customer>(request.Id);
    }
}
```

See the [OrmLite GitHub](/ormlite/ormlite-apis) page for more info on working with the OrmLite APIs.
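The `CustomerService` above references Request DTOs that aren't shown on this page. Here's a minimal sketch of what they might look like — the routes & property names are assumptions for illustration:

```csharp
public class Customer
{
    [AutoIncrement]
    public int Id { get; set; }
    public string Name { get; set; }
}

// Hypothetical Request DTOs matching the CustomerService method signatures
[Route("/customers", "GET")]
public class GetCustomers : IReturn<GetCustomersResponse> {}
public class GetCustomersResponse
{
    public List<Customer> Results { get; set; }
}

[Route("/customers/{Id}", "GET")]
public class GetCustomer : IReturn<Customer>
{
    public int Id { get; set; }
}

[Route("/customers", "POST")]
public class CreateCustomer : IReturn<Customer>
{
    public string Name { get; set; }
}

[Route("/customers/{Id}", "PUT")]
public class UpdateCustomer : IReturn<Customer>
{
    public int Id { get; set; }
    public string Name { get; set; }
}

[Route("/customers/{Id}", "DELETE")]
public class DeleteCustomer
{
    public int Id { get; set; }
}
```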
## Getting Started with AWS RDS Aurora and OrmLite

![](https://github.com/ServiceStack/Assets/raw/master/img/aws/rds-aurora-powered-by-aws.png)

The ServiceStack.OrmLite library has support for [Aurora](https://aws.amazon.com/rds/aurora/) databases via the [`ServiceStack.OrmLite.MySql`](https://www.nuget.org/packages/ServiceStack.OrmLite.MySql/) NuGet package, which can be used in conjunction with Amazon's RDS service using Aurora. To get started, first you will need to create your Aurora database via the AWS RDS service.

## Creating an Aurora Instance

1. Login to the [AWS Web console](https://console.aws.amazon.com/console/home).
2. Select [RDS](https://console.aws.amazon.com/rds/home) from the **Services** menu at the top.

![](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/aws/aws-rds-menu.png)

3. Select **Instances** from the **RDS Dashboard** and click **Launch DB Instance**.

![](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/aws/launch-db-dashboard.png)

The above steps will start the RDS Wizard to launch a new DB instance. To set up a new Aurora instance, follow the wizard selecting the appropriate options for your application. As an example, we can create a `Customers` database for a non-production environment.

- **Select Engine**
  - Select Amazon Aurora
- **Specify DB Details**
  - Create a `db.r3.large` instance with default settings
  - Specify **Multi-AZ Deployment** as `No`

![](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/aws/aurora-default-details.png)

  - Specify **DB Instance Identifier**, eg `servicestack-example-customers`.
  - Specify **Master Username**, eg `admin`.
  - Create and confirm master user password.
- **Configure Advanced Settings**
  - Leave the suggested settings and specify a **Database Name**, eg `customers`. This will be used in your connection string.

::: info
Problems can occur if your default VPC does not have DNS Resolution and/or DNS Hostnames enabled.
Navigate to **Services**, **VPC** and enable these two options on your default VPC.

The default settings create a new VPC security group that allows remote access to your DB instance based on your IP address. If your IP address changes, you will lose remote access and this security group will need to be updated.
:::

Click **Launch DB Instance** at the *bottom right* to launch your new instance. If all is successful, you should see the following.

![](https://github.com/ServiceStack/Assets/raw/master/img/aws/create-db-success.png)

## Connecting with ServiceStack.OrmLite

Now that your Aurora instance is running, connecting with OrmLite requires the `ServiceStack.OrmLite.MySql` NuGet package as well as a connection string to your new Aurora instance.

```xml
<appSettings>
  <add key="ConnectionString" 
       value="Uid={User};Password={Password};Server={EndpointUrl};Port={EndpointPort};Database=customers" />
</appSettings>
```

:::copy
`<PackageReference Include="ServiceStack.OrmLite.MySql" Version="10.*" />`
:::

Once this dependency is installed, an `OrmLiteConnectionFactory` using the `MySqlDialect.Provider` can be configured in the AppHost's `Configure` method. For example:

```csharp
public class AppHost : AppSelfHostBase
{
    public AppHost() : base("AWS Aurora Customers", typeof(AppHost).Assembly) {}

    public override void Configure(Container container)
    {
        container.Register<IDbConnectionFactory>(c => new OrmLiteConnectionFactory(
            AppSettings.GetString("ConnectionString"), MySqlDialect.Provider));

        using (var db = container.Resolve<IDbConnectionFactory>().Open())
        {
            if (db.CreateTableIfNotExists<Customer>())
            {
                //Add seed data
            }
        }
    }
}
```

From within a ServiceStack Service, the `Db` property gives us a connection for accessing our `Customer` table. Below is an example of a CRUD service using OrmLite:
```csharp
public class CustomerService : Service
{
    public object Get(GetCustomers request)
    {
        return new GetCustomersResponse { Results = Db.Select<Customer>() };
    }

    public object Get(GetCustomer request)
    {
        return Db.SingleById<Customer>(request.Id);
    }

    public object Post(CreateCustomer request)
    {
        var customer = new Customer { Name = request.Name };
        Db.Save(customer);
        return customer;
    }

    public object Put(UpdateCustomer request)
    {
        var customer = Db.SingleById<Customer>(request.Id);
        if (customer == null)
            throw HttpError.NotFound("Customer '{0}' does not exist".Fmt(request.Id));
        customer.Name = request.Name;
        Db.Update(customer);
        return customer;
    }

    public void Delete(DeleteCustomer request)
    {
        Db.DeleteById<Customer>(request.Id);
    }
}
```

See the [OrmLite GitHub](/ormlite/ormlite-apis) page for more info on working with the OrmLite API.

## Getting Started with AWS RDS MySQL and OrmLite

![](https://github.com/ServiceStack/Assets/raw/master/img/aws/rds-mysql-powered-by-aws.png)

The ServiceStack.OrmLite library supports a [MySQL](https://www.mysql.com/) database via the [`ServiceStack.OrmLite.MySql`](https://www.nuget.org/packages/ServiceStack.OrmLite.MySql/) NuGet package. This can be used in conjunction with Amazon's RDS service using MySQL.

To get started, first you will need to create your MySQL database via the AWS RDS service.

## Creating a MySQL RDS Instance

1. Login to the [AWS Web console](https://console.aws.amazon.com/console/home).
2. Select [RDS](https://console.aws.amazon.com/rds/home) from the **Services** menu at the top.

![](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/aws/aws-rds-menu.png)

3. Select **Instances** from the **RDS Dashboard** and click **Launch DB Instance**.

![](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/aws/launch-db-dashboard.png)

The above steps will start the RDS Wizard to launch a new DB instance.
To set up a new MySQL instance, follow the wizard, selecting the appropriate options for your application. As an example, we can create a `Customers` database for a non-production environment.

- **Select Engine**
    - Select MySQL
- **Production?**
    - Select `No` (this example doesn't require a production, multi-instance setup)
- **Specify DB Details**
    - Create a `db.t2.micro` instance with default settings
    - Specify **Multi-AZ Deployment** as `No`
      ![](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/aws/mysql-default-details.png)
    - Specify **DB Instance Identifier**, eg `servicestack-example-customers`.
    - Specify **Master Username**, eg `admin`.
    - Create and confirm master user password.
- **Configure Advanced Settings**
    - Leave the suggested settings and specify a database name, eg `customers`. This will be used in your connection string.

::: info
Problems can occur if your default VPC does not have DNS Resolution and/or DNS Hostnames enabled. Navigate to **Services**, **VPC** and enable these two options on your default VPC
:::

Click **Launch DB Instance** at the *bottom right* to launch your new instance. If all is successful, you should see the following.

![](https://github.com/ServiceStack/Assets/raw/master/img/aws/create-db-success.png)

## Connecting with ServiceStack.OrmLite

Now that your MySQL instance is running, connecting with OrmLite requires the `ServiceStack.OrmLite.MySql` NuGet package as well as a connection string to your new MySQL instance.

```xml
<appSettings>
  <add key="ConnectionString" 
       value="Uid={User};Password={Password};Server={EndpointUrl};Port={EndpointPort};Database=customers" />
</appSettings>
```

:::copy
`<PackageReference Include="ServiceStack.OrmLite.MySql" Version="10.*" />`
:::

Once this dependency is installed, an `OrmLiteConnectionFactory` using the `MySqlDialect.Provider` can be configured in the AppHost's `Configure` method. For example:
```csharp
public class AppHost : AppSelfHostBase
{
    public AppHost() : base("AWS MySql Customers", typeof(AppHost).Assembly) {}

    public override void Configure(Container container)
    {
        container.Register<IDbConnectionFactory>(c => new OrmLiteConnectionFactory(
            AppSettings.GetString("ConnectionString"), MySqlDialect.Provider));

        using (var db = container.Resolve<IDbConnectionFactory>().Open())
        {
            if (db.CreateTableIfNotExists<Customer>())
            {
                //Add seed data
            }
        }
    }
}
```

From within a ServiceStack Service, the `Db` property gives us a connection for accessing our `Customer` table. Below is an example of a CRUD service using OrmLite:

```csharp
public class CustomerService : Service
{
    public object Get(GetCustomers request)
    {
        return new GetCustomersResponse { Results = Db.Select<Customer>() };
    }

    public object Get(GetCustomer request)
    {
        return Db.SingleById<Customer>(request.Id);
    }

    public object Post(CreateCustomer request)
    {
        var customer = new Customer { Name = request.Name };
        Db.Save(customer);
        return customer;
    }

    public object Put(UpdateCustomer request)
    {
        var customer = Db.SingleById<Customer>(request.Id);
        if (customer == null)
            throw HttpError.NotFound("Customer '{0}' does not exist".Fmt(request.Id));
        customer.Name = request.Name;
        Db.Update(customer);
        return customer;
    }

    public void Delete(DeleteCustomer request)
    {
        Db.DeleteById<Customer>(request.Id);
    }
}
```

See the [OrmLite GitHub](/ormlite/ormlite-apis) page for more info on working with the OrmLite API.

## Getting Started with AWS RDS MariaDB and OrmLite

![](https://github.com/ServiceStack/Assets/raw/master/img/aws/rds-mariadb-powered-by-aws.png)

The ServiceStack.OrmLite library supports a [MariaDB](https://mariadb.org/) database via the [`ServiceStack.OrmLite.MySql`](https://www.nuget.org/packages/ServiceStack.OrmLite.MySql/) NuGet package. This can be used in conjunction with Amazon's RDS service using MariaDB.
::: info
MariaDB is a "binary drop in replacement for MySQL", which is why the `ServiceStack.OrmLite.MySql` NuGet package can be used. For more information, see the [MariaDB documentation](https://mariadb.com/kb/en/mariadb/mariadb-vs-mysql-compatibility/)
:::

To get started, first you will need to create your MariaDB database via the AWS RDS service.

## Creating a MariaDB RDS Instance

1. Login to the [AWS Web console](https://console.aws.amazon.com/console/home).
2. Select [RDS](https://console.aws.amazon.com/rds/home) from the **Services** menu at the top.

![](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/aws/aws-rds-menu.png)

3. Select **Instances** from the **RDS Dashboard** and click **Launch DB Instance**.

![](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/aws/launch-db-dashboard.png)

The above steps will start the RDS Wizard to launch a new DB instance. To set up a new MariaDB instance, follow the wizard, selecting the appropriate options for your application. As an example, we can create a `Customers` database for a non-production environment.

- **Select Engine**
    - Select MariaDB
- **Production?**
    - Select `No` (this example doesn't require a production, multi-instance setup)
- **Specify DB Details**
    - Create a `db.t2.micro` instance with default settings
    - Specify **Multi-AZ Deployment** as `No`
      ![](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/aws/mariadb-default-details.png)
    - Specify **DB Instance Identifier**, eg `servicestack-example-customers`.
    - Specify **Master Username**, eg `admin`.
    - Create and confirm master user password.
- **Configure Advanced Settings**
    - Leave the suggested settings and specify a database name, eg `customers`. This will be used in your connection string.

::: info
Problems can occur if your default VPC does not have DNS Resolution and/or DNS Hostnames enabled.
Navigate to **Services**, **VPC** and enable these two options on your default VPC
:::

Click **Launch DB Instance** at the *bottom right* to launch your new instance. If all is successful, you should see the following.

![](https://github.com/ServiceStack/Assets/raw/master/img/aws/create-db-success.png)

## Connecting with ServiceStack.OrmLite

Now that your MariaDB instance is running, connecting with OrmLite requires the `ServiceStack.OrmLite.MySql` NuGet package as well as a connection string to your new MariaDB instance.

```xml
<appSettings>
  <add key="ConnectionString" 
       value="Uid={User};Password={Password};Server={EndpointUrl};Port={EndpointPort};Database=customers" />
</appSettings>
```

:::copy
`<PackageReference Include="ServiceStack.OrmLite.MySql" Version="10.*" />`
:::

Once this dependency is installed, an `OrmLiteConnectionFactory` using the `MySqlDialect.Provider` can be configured in the AppHost's `Configure` method. For example:

```csharp
public class AppHost : AppSelfHostBase
{
    public AppHost() : base("AWS MariaDB Customers", typeof(AppHost).Assembly) {}

    public override void Configure(Container container)
    {
        container.Register<IDbConnectionFactory>(c => new OrmLiteConnectionFactory(
            AppSettings.GetString("ConnectionString"), MySqlDialect.Provider));

        using (var db = container.Resolve<IDbConnectionFactory>().Open())
        {
            if (db.CreateTableIfNotExists<Customer>())
            {
                //Add seed data
            }
        }
    }
}
```

From within a ServiceStack Service, the `Db` property gives us a connection for accessing our `Customer` table. Below is an example of a CRUD service using OrmLite:
```csharp
public class CustomerService : Service
{
    public object Get(GetCustomers request)
    {
        return new GetCustomersResponse { Results = Db.Select<Customer>() };
    }

    public object Get(GetCustomer request)
    {
        return Db.SingleById<Customer>(request.Id);
    }

    public object Post(CreateCustomer request)
    {
        var customer = new Customer { Name = request.Name };
        Db.Save(customer);
        return customer;
    }

    public object Put(UpdateCustomer request)
    {
        var customer = Db.SingleById<Customer>(request.Id);
        if (customer == null)
            throw HttpError.NotFound("Customer '{0}' does not exist".Fmt(request.Id));
        customer.Name = request.Name;
        Db.Update(customer);
        return customer;
    }

    public void Delete(DeleteCustomer request)
    {
        Db.DeleteById<Customer>(request.Id);
    }
}
```

See the [OrmLite GitHub](/ormlite/ormlite-apis) page for more info on working with the OrmLite API.

## Getting Started with AWS RDS SQL Server and OrmLite

![](https://github.com/ServiceStack/Assets/raw/master/img/aws/rds-sqlserver-powered-by-aws.png)

The ServiceStack.OrmLite library supports a [Microsoft SQL Server](http://www.microsoft.com/en-au/server-cloud/products/sql-server/) database via the [`ServiceStack.OrmLite.SqlServer`](https://www.nuget.org/packages/ServiceStack.OrmLite.SqlServer/) NuGet package. This can be used in conjunction with Amazon's RDS service using SQL Server.

To get started, first you will need to create your SQL Server database via the AWS RDS service.

## Creating a SQL Server RDS Instance

1. Login to the [AWS Web console](https://console.aws.amazon.com/console/home).
2. Select [RDS](https://console.aws.amazon.com/rds/home) from the **Services** menu at the top.

![](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/aws/aws-rds-menu.png)

3. Select **Instances** from the **RDS Dashboard** and click **Launch DB Instance**.

![](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/aws/launch-db-dashboard.png)

The above steps will start the RDS Wizard to launch a new DB instance.
To set up a new SQL Server instance, follow the wizard, selecting the appropriate options for your application. As an example, we can create a `Customers` database for a non-production environment.

- **Select Engine**
    - Select **SQL Server**
    - Select the appropriate SQL Server version, for this example **SQL Server SE**
- **Specify DB Details**
    - Select the **License Model** `license-included`
    - Create a `db.m1.small` instance with default settings by changing the **DB Instance Class**.
      ![](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/aws/mssql-default-details.png)
    - Specify **DB Instance Identifier**, eg `customers`.
    - Specify **Master Username**, eg `admin`.
    - Create and confirm master user password.
- **Configure Advanced Settings**
    - Leave the suggested settings, which will create your RDS instance with a network rule that restricts public access to your current public IP address.

::: info
Problems can occur if your default VPC does not have DNS Resolution and/or DNS Hostnames enabled. Navigate to **Services**, **VPC** and enable these two options on your default VPC
:::

Click **Launch DB Instance** at the *bottom right* to launch your new instance. If all is successful, you should see the following.

![](https://github.com/ServiceStack/Assets/raw/master/img/aws/create-db-success.png)

## Connecting with ServiceStack.OrmLite

Now that your SQL Server instance is running, connecting with OrmLite requires the `ServiceStack.OrmLite.SqlServer` NuGet package as well as a connection string to your new SQL Server instance.

::: info
If you are connecting to a new instance without a database, you'll need to create a new Database via SQL Management Studio first.
For this example the `customers` database was created
:::

```xml
<appSettings>
  <add key="ConnectionString" 
       value="Data Source={Endpoint},{Port};Initial Catalog=customers;User ID={User};Password={Password}" />
</appSettings>
```

:::copy
`<PackageReference Include="ServiceStack.OrmLite.SqlServer" Version="10.*" />`
:::

Once this dependency is installed, an `OrmLiteConnectionFactory` using the `SqlServerDialect.Provider` can be configured in the AppHost's `Configure` method. For example:

```csharp
public class AppHost : AppSelfHostBase
{
    public AppHost() : base("AWS SQL Server Customers", typeof(AppHost).Assembly) {}

    public override void Configure(Container container)
    {
        container.Register<IDbConnectionFactory>(c => new OrmLiteConnectionFactory(
            AppSettings.GetString("ConnectionString"), SqlServerDialect.Provider));

        using (var db = container.Resolve<IDbConnectionFactory>().Open())
        {
            if (db.CreateTableIfNotExists<Customer>())
            {
                //Add seed data
            }
        }
    }
}
```

From within a ServiceStack Service, the `Db` property gives us a connection for accessing our `Customer` table. Below is an example of a CRUD service using OrmLite:

```csharp
public class CustomerService : Service
{
    public object Get(GetCustomers request)
    {
        return new GetCustomersResponse { Results = Db.Select<Customer>() };
    }

    public object Get(GetCustomer request)
    {
        return Db.SingleById<Customer>(request.Id);
    }

    public object Post(CreateCustomer request)
    {
        var customer = new Customer { Name = request.Name };
        Db.Save(customer);
        return customer;
    }

    public object Put(UpdateCustomer request)
    {
        var customer = Db.SingleById<Customer>(request.Id);
        if (customer == null)
            throw HttpError.NotFound("Customer '{0}' does not exist".Fmt(request.Id));
        customer.Name = request.Name;
        Db.Update(customer);
        return customer;
    }

    public void Delete(DeleteCustomer request)
    {
        Db.DeleteById<Customer>(request.Id);
    }
}
```

See the [OrmLite GitHub](/ormlite/ormlite-apis) page for more info on working with the OrmLite API.
## Getting started with AWS ElastiCache Redis and ServiceStack

![](https://github.com/ServiceStack/Assets/raw/master/img/aws/elasticache-redis-powered-by-aws.png)

Amazon's ElastiCache provides a simple way to create and manage cache instances that can be easily incorporated into your ServiceStack application stack using the ServiceStack Redis client, `ServiceStack.Redis`.

#### Creating an ElastiCache Cluster

1. Login to the [AWS Web console](https://console.aws.amazon.com/console/home).
2. Select [ElastiCache](https://console.aws.amazon.com/elasticache/home) from the **Services** menu at the top.

![](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/aws/aws-services-menu-elasticcache.png)

3. Select **Get Started Now** or **ElastiCache Dashboard** and **Launch Cache Cluster**
4. Select **Redis** for the cluster engine.

You can run your cache as a single Redis node or add multiple nodes for additional redundancy. In this example, we will be using 3 nodes: one as a primary (master) node and 2 read-only replicas (slaves).

![](https://github.com/ServiceStack/Assets/raw/master/img/aws/elasticcache-redis-config.png)

::: info
To use the smaller instances like the `cache.t2.micro`, **Multi-AZ** must be disabled
:::

So your EC2 instance can access your Redis nodes, ensure you select a **VPC Security Group** that exposes the default port `6379`.

![](https://github.com/ServiceStack/Assets/raw/master/img/aws/elasticcache-redis-adv.png)

::: info
If you haven't already set up a security group exposing this port, you'll need to create one by [managing your VPC security groups](https://console.aws.amazon.com/vpc/home#securityGroups:)
:::

To finish, review your settings and click **Launch Replication Group**.

## Enable Caching with ServiceStack.Redis

Now that your Redis nodes are ready, your AppHost can be configured to use them when deployed.
AWS **does not allow external access** to ElastiCache servers, so they can only be used when your ServiceStack application is deployed.

First, you'll need to install the `ServiceStack.Redis` NuGet package if your application doesn't already use it.

:::copy
`<PackageReference Include="ServiceStack.Redis" Version="10.*" />`
:::

In this example, we are going to use a `PooledRedisClientManager` as our `IRedisClientsManager`. This will be responsible for creating the `ICacheClient`s our `Service`s will use to connect to the ElastiCache nodes.

We will need to provide our `PooledRedisClientManager` with the nodes we have created. For example, as shown above, we created a cluster of **1 Primary** (master) and **2 Read Replicas** (slaves); these endpoint URLs can be accessed from the ElastiCache **Dashboard**.

![](https://github.com/ServiceStack/Assets/raw/master/img/aws/elasticcache-redis-nodes.png)

Below is a simple example of a configured self-hosting AppHost that uses ElastiCache for caching when deployed and in-memory caching when developing locally.

```csharp
public class AppHost : AppSelfHostBase
{
    public AppHost() : base("AWS Redis ElastiCache Example", typeof(MyServices).Assembly) { }

    public override void Configure(Container container)
    {
        //Your DB initialization
        //...

        // AWS ElastiCache servers are NOT accessible from outside AWS
        // Use MemoryCacheClient locally
        if (AppSettings.GetString("Environment") == "Production")
        {
            container.Register<IRedisClientsManager>(c => new PooledRedisClientManager(
                // Primary node from AWS (master)
                AwsElastiCacheConfig.MasterNodes,
                // Read replica nodes from AWS (slaves)
                AwsElastiCacheConfig.SlaveNodes));

            container.Register<ICacheClient>(c =>
                container.Resolve<IRedisClientsManager>().GetCacheClient());
        }
        else
        {
            container.Register<ICacheClient>(new MemoryCacheClient());
        }
    }
}
```

With configuration provided in your application config.
```xml
<appSettings>
  <add key="Environment" value="Production"/>
  <add key="MasterNodes" value="{YourAWSPrimaryNodeAddress}"/>
  <add key="SlaveNodes" value="{Your1stAWSReadReplicaNodeAddress},{AWSReadReplicaNodeAddress}"/>
</appSettings>
```

Now that your caching is set up and connecting, you can easily cache your web service responses by returning `Request.ToOptimizedResultUsingCache` from within a ServiceStack `Service`.

For example, returning a customer's full details might be an expensive database query. We can cache the result in the ElastiCache cluster for a faster response and invalidate the cache when the details are updated.

```csharp
public class CustomerService : Service
{
    private static string CacheKey = "customer_details_{0}";

    public object Get(GetCustomer request)
    {
        return this.Request.ToOptimizedResultUsingCache(this.Cache, CacheKey.Fmt(request.Id), () =>
        {
            Thread.Sleep(500); //Long request
            return new GetCustomerResponse
            {
                Result = this.Db.LoadSingleById<Customer>(request.Id)
            };
        });
    }

    public object Put(UpdateCustomer request)
    {
        var customer = this.Db.LoadSingleById<Customer>(request.Id);
        customer = customer.PopulateWith(request.ConvertTo<Customer>());
        this.Db.Update(customer);
        //Invalidate customer details cache
        this.Cache.ClearCaches(CacheKey.Fmt(request.Id));
        return new UpdateCustomerResponse { Result = customer };
    }
}
```

::: info
As this example uses the `Cache` property from the `Service` in a distributed cache environment, `Cache.Get<T>` values come from the read replica (slave) instances, which take time to replicate from a previous `Cache.Set<T>` call.
To guarantee a value is immediately available, you can reuse the same instance by resolving a client from the `IRedisClientsManager` within your `Service` method
:::

```csharp
using var cache = this.RedisClientManager.GetClient();
//Your Cache Client code
```

## Getting started with AWS ElastiCache and ServiceStack

### Memcached

![](https://github.com/ServiceStack/Assets/raw/master/img/aws/elasticache-memcached-powered-by-aws.png)

Amazon's ElastiCache provides a simple way to create and manage Memcached instances that can be easily incorporated into your ServiceStack application stack using the ServiceStack NuGet package `ServiceStack.Caching.Memcached`.

#### Creating an ElastiCache Cluster

1. Login to the [AWS Web console](https://console.aws.amazon.com/console/home).
2. Select [ElastiCache](https://console.aws.amazon.com/elasticache/home) from the **Services** menu at the top.

![](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/aws/aws-services-menu-elasticcache.png)

3. Select **Get Started Now** or **ElastiCache Dashboard** and **Launch Cache Cluster**
4. Select **Memcached** for the cluster engine.

The ElastiCache setup allows you to specify how many nodes you want in your cache cluster. In this example, we will be using 3.

![](https://github.com/ServiceStack/Assets/raw/master/img/aws/elasticcache-memcached-config.png)

So your EC2 instance can access your Memcached cluster, ensure you select a **VPC Security Group** that exposes the default port `11211`.

![](https://github.com/ServiceStack/Assets/raw/master/img/aws/elasticcache-memcached-adv.png)

::: info
If you haven't already set up a security group exposing this port, you'll need to create one by [managing your VPC security groups](https://console.aws.amazon.com/vpc/home#securityGroups:)
:::

To finish, review your settings and click **Launch Cache Cluster**.
## Enable Caching in your ServiceStack application

Now that your Memcached cluster is ready, your AppHost can be configured to use it when deployed. AWS **does not allow external access** to ElastiCache servers, so they can only be used when your ServiceStack application is deployed.

First, you'll need to install `ServiceStack.Caching.Memcached`:

:::copy
`<PackageReference Include="ServiceStack.Caching.Memcached" Version="10.*" />`
:::

To access the Memcached nodes from your `Service`s, you will need to register a `MemcachedClientCache` as an `ICacheClient` with the IoC container. This client has to be initialized with each of the node endpoints provided by AWS. From the [ElastiCache Dashboard](https://console.aws.amazon.com/elasticache/home), click on the **nodes** of your cluster to see the node endpoint URLs.

![Memcached cluster view from Dashboard](https://github.com/ServiceStack/Assets/raw/master/img/aws/elasticcache-memcached-nodes.png)

This will show all the nodes in the cluster. For example:

![Listed node endpoints](https://github.com/ServiceStack/Assets/raw/master/img/aws/elasticcache-memcached-node-urls.png)

Below is a simple example of a configured self-hosting AppHost that uses ElastiCache for caching when deployed and in-memory caching when developing locally.

```csharp
public class AppHost : AppSelfHostBase
{
    public AppHost() : base("AWS Memcached ElastiCache Example", typeof(MyServices).Assembly) {}

    public override void Configure(Container container)
    {
        //Your DB initialization
        //...

        // AWS ElastiCache servers are NOT accessible from outside AWS
        // Use MemoryCacheClient locally
        if (AppSettings.GetString("Environment") == "Production")
        {
            container.Register<ICacheClient>(c => new MemcachedClientCache(
                AwsElastiCacheConfig.MemcachedNodes));
        }
        else
        {
            container.Register<ICacheClient>(new MemoryCacheClient());
        }
    }
}
```

With configuration provided in your application config.
```xml
<appSettings>
  <add key="Environment" value="Production"/>
  <add key="MemcachedNodes" value="{MemcachedNodeAddress1},{MemcachedNodeAddress2}"/>
</appSettings>
```

Now that your caching is set up and connecting, you can easily cache your web service responses by returning `Request.ToOptimizedResultUsingCache` from within a ServiceStack `Service`.

For example, returning a customer's full details might be an expensive database query. We can cache the result in the ElastiCache cluster for a faster response and invalidate the cache when the details are updated.

```csharp
public class CustomerService : Service
{
    private static string CacheKey = "customer_details_{0}";

    public object Get(GetCustomer request)
    {
        return this.Request.ToOptimizedResultUsingCache(this.Cache, CacheKey.Fmt(request.Id), () =>
        {
            Thread.Sleep(500); //Long request
            return new GetCustomerResponse
            {
                Result = this.Db.LoadSingleById<Customer>(request.Id)
            };
        });
    }

    public object Put(UpdateCustomer request)
    {
        var customer = this.Db.LoadSingleById<Customer>(request.Id);
        customer = customer.PopulateWith(request.ConvertTo<Customer>());
        this.Db.Update(customer);
        //Invalidate customer details cache
        this.Cache.ClearCaches(CacheKey.Fmt(request.Id));
        return new UpdateCustomerResponse { Result = customer };
    }
}
```

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/aws-pocodynamo.md

---
title: PocoDynamo
---

PocoDynamo is a highly productive, feature-rich, typed .NET client which extends [ServiceStack's Simple POCO life](http://stackoverflow.com/a/32940275/85785) by enabling re-use of your code-first data models with Amazon's industrial strength and highly-scalable NoSQL [DynamoDB](https://aws.amazon.com/dynamodb/).
## First class support for reusable, code-first POCOs

PocoDynamo is conceptually similar to ServiceStack's other code-first [OrmLite](/ormlite/) and [Redis](https://github.com/ServiceStack/ServiceStack.Redis) clients, providing a high-fidelity, managed client that enhances AWSSDK's low-level [IAmazonDynamoDB client](http://docs.aws.amazon.com/amazondynamodb/latest/developerguide/UsingAWSsdkForDotNet.html) with rich, native support for intuitively mapping your re-usable code-first POCO data models into [DynamoDB Data Types](http://docs.aws.amazon.com/amazondynamodb/latest/APIReference/API_Types.html).

![](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/aws/pocodynamo/related-customer.png)

### [AutoQuery DynamoDB](/aws-pocodynamo)

Built on top of PocoDynamo, [AutoQuery Data's](/autoquery/data) `DynamoDbSource` provides the most productive development experience for effortlessly creating rich, queryable and optimized Services for DynamoDB data stores using only a typed Request DTO.
### Quick Preview

A quick CRUD preview of **PocoDynamo**'s feature-rich, high-level typed client:

```csharp
using System;
using Amazon;
using Amazon.DynamoDBv2;
using ServiceStack;
using ServiceStack.Text;
using ServiceStack.Aws.DynamoDb;
using ServiceStack.DataAnnotations;

var awsDb = new AmazonDynamoDBClient("keyId", "key",
    new AmazonDynamoDBConfig { ServiceURL = "http://localhost:8000" });
var db = new PocoDynamo(awsDb);

db.RegisterTable<Todo>();
db.DeleteTable<Todo>(); // Delete existing Todo Table (if any)
db.InitSchema();        // Creates Todo DynamoDB Table

var newTodo = new Todo { Content = "Learn PocoDynamo", Order = 1 };
db.PutItem(newTodo);

var savedTodo = db.GetItem<Todo>(newTodo.Id);
"Saved Todo: {0}".Print(savedTodo.Dump());

savedTodo.Done = true;
db.PutItem(savedTodo);

var updatedTodo = db.GetItem<Todo>(newTodo.Id);
"Updated Todo: {0}".Print(updatedTodo.Dump());

db.DeleteItem<Todo>(newTodo.Id);

var remainingTodos = db.GetAll<Todo>();
"No more Todos: {0}".Print(remainingTodos.Dump());

public class Todo
{
    [AutoIncrement]
    public long Id { get; set; }
    public string Content { get; set; }
    public int Order { get; set; }
    public bool Done { get; set; }
}
```

## Features

#### Advanced idiomatic .NET client

PocoDynamo provides an idiomatic API that leverages advanced .NET language features, with streaming APIs that return lazily-evaluated `IEnumerable<T>` responses which transparently perform multi-paged requests behind the scenes whilst the resultset is iterated.

Its high-level APIs provide a clean, lightweight adapter that transparently maps between .NET's built-in data types and DynamoDB's low-level attribute values.

Its efficient batched APIs take advantage of DynamoDB's `BatchWriteItem` and `BatchGetItem` batch operations to perform the minimum number of requests required to implement each API.
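As a concrete illustration of the batched and lazily-streamed APIs described above, a minimal sketch building on the `Todo` model and `db` client from the Quick Preview. The `PutItems` batch API and the `Times`/`PrintDump` ServiceStack.Text extensions are assumed here rather than taken from this page:

```csharp
// Batched write: PocoDynamo issues the minimum number of
// BatchWriteItem requests needed to store all 100 items
var todos = 100.Times(i => new Todo { Content = "TODO " + i, Order = i });
db.PutItems(todos);

// Lazily-evaluated streaming: multi-page Scan requests are made
// transparently as the IEnumerable<T> is iterated
foreach (var todo in db.GetAll<Todo>().Take(10))
    todo.PrintDump();
```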
#### Typed, LINQ provider for Query and Scan Operations

PocoDynamo also provides rich, typed, LINQ-like querying support for constructing DynamoDB Query and Scan operations, dramatically reducing the effort to query DynamoDB and enhancing readability whilst benefiting from type safety in .NET.

#### Declarative Tables and Indexes

Behind the scenes DynamoDB is built on a dynamic schema which, whilst open and flexible, can be cumbersome to work with directly in typed languages like C#. PocoDynamo bridges the gap and lets your app bind to impl-free and declarative POCO data models that provide an ideal high-level abstraction for your business logic, hiding a lot of the complexity of working with DynamoDB - dramatically reducing the code and effort required whilst increasing the readability and maintainability of your App's business logic.

It includes optimal support for defining simple local indexes, which only require declaratively annotating the properties to index with an `[Index]` attribute.

Typed POCO data models can be used to define more complex Local and Global DynamoDB Indexes by implementing the `IGlobalIndex<Poco>` or `ILocalIndex<Poco>` interfaces, which PocoDynamo uses along with the POCO's class structure to construct table indexes at the same time it creates the tables.

In this way the Type is used as a DSL to define DynamoDB indexes, where the definition of the index is decoupled from the imperative code required to create and query it - reducing the effort to create them whilst improving the visualization and understanding of your DynamoDB architecture, which can be inferred at a glance from the POCO's Type definition.

PocoDynamo also includes first-class support for constructing and querying Global and Local Indexes using a familiar, typed LINQ provider.
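To make the declarative index idea concrete, a minimal sketch of a Global Index defined as a Type. The `Order`/`OrderCostIndex` models, the `[References]` association and the `FromQueryIndex` query API are illustrative assumptions based on the interfaces named above, not examples from this page:

```csharp
// The index is itself a POCO implementing IGlobalIndex<T>,
// declaring its own hash and range keys
public class OrderCostIndex : IGlobalIndex<Order>
{
    [HashKey]
    public int ProductId { get; set; }

    [RangeKey]
    public decimal Cost { get; set; }

    public int OrderId { get; set; }
}

[References(typeof(OrderCostIndex))] // associate the index with the table
public class Order
{
    [AutoIncrement]
    public int OrderId { get; set; }
    public int ProductId { get; set; }
    public decimal Cost { get; set; }
    public int Qty { get; set; }
}

// The index is created together with the Order table:
db.RegisterTable<Order>();
db.InitSchema();

// and can be queried with the typed LINQ-like provider:
var expensiveOrders = db.FromQueryIndex<OrderCostIndex>(
    x => x.ProductId == 1 && x.Cost > 10).Exec();
```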
#### Resilient

Each operation is called within a managed execution which transparently absorbs the variance in cloud services reliability with automatic retries of temporary errors, using an exponential backoff as recommended by Amazon.

#### Enhances existing APIs

PocoDynamo APIs are a lightweight layer modeled after DynamoDB APIs, making it easy to predict which DynamoDB operations each API calls under the hood, retaining your existing knowledge investment in DynamoDB. When more flexibility is needed you can access the low-level `AmazonDynamoDBClient` from the `IPocoDynamo.DynamoDb` property and talk with it directly.

Whilst PocoDynamo doesn't save you from needing to learn DynamoDB, its deep integration with .NET and rich support for POCOs smooths out the impedance mismatches to enable a type-safe, idiomatic, productive development experience.

#### High-level features

PocoDynamo includes its own high-level features to improve the re-usability of your POCO models and the development experience of working with DynamoDB, with support for Auto Incrementing sequences, Query expression builders, auto escaping and converting of Reserved Words to placeholder values, configurable converters, scoped client configurations, related items, conventions, aliases, dep-free data annotation attributes and more.

## Download

PocoDynamo is contained in ServiceStack's AWS NuGet package:

:::copy
`<PackageReference Include="ServiceStack.Aws" Version="10.*" />`
:::

> PocoDynamo has a 10 Tables [free-quota usage](https://servicestack.net/download#free-quotas) limit which can be unlocked with a [commercial license key](https://servicestack.net/pricing).
To get started we'll need to create an instance of `AmazonDynamoDBClient` with your AWS credentials and Region info:

```csharp
var awsDb = new AmazonDynamoDBClient(AWS_ACCESS_KEY, AWS_SECRET_KEY, RegionEndpoint.USEast1);
```

Then to create a PocoDynamo client, pass in the configured AmazonDynamoDBClient instance above:

```csharp
var db = new PocoDynamo(awsDb);
```

::: info
Clients are Thread-Safe so you can register them as a singleton and share the same instance throughout your App
:::

### [Source Code](https://github.com/ServiceStack/ServiceStack/tree/main/ServiceStack.Aws/src/ServiceStack.Aws/DynamoDb)

The Source Code for PocoDynamo is maintained in the [ServiceStack.Aws](https://github.com/ServiceStack/ServiceStack/tree/main/ServiceStack.Aws) repository.

### Download Local DynamoDB

It's recommended to download [local DynamoDB](http://docs.aws.amazon.com/amazondynamodb/latest/developerguide/Tools.DynamoDBLocal.html#Tools.DynamoDBLocal.DownloadingAndRunning) as it lets you develop against a local DynamoDB instance, saving you from needing a network connection or an AWS account.

You can connect to your local DynamoDB instance by configuring the `AmazonDynamoDBClient` to point to the default URL where the local DynamoDB instance is running:

```csharp
var awsDb = new AmazonDynamoDBClient("keyId", "key",
    new AmazonDynamoDBConfig { ServiceURL = "http://localhost:8000" });
var db = new PocoDynamo(awsDb);
```

We've found the latest version of Local DynamoDB to be a robust and fast substitute for AWS that eliminates waiting times for things like creating and dropping tables, whilst only slightly deviating from the capabilities of AWS as it doesn't always enforce the additional limitations imposed when hosted on AWS.
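Putting the above together, a typical App creates and registers a single client on Startup. This is a sketch assuming a ServiceStack AppHost using the built-in Funq `Container`, where registrations are singletons by default:

```csharp
public override void Configure(Container container)
{
    // Local DynamoDB for development; swap for AWS credentials + Region in production
    var awsDb = new AmazonDynamoDBClient("keyId", "key",
        new AmazonDynamoDBConfig { ServiceURL = "http://localhost:8000" });

    // Clients are Thread-Safe, so one instance can be shared throughout the App
    container.Register<IPocoDynamo>(c => new PocoDynamo(awsDb));
}
```

Services can then resolve `IPocoDynamo` as a normal IOC dependency, as shown in the TodoService example below.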
## Usage

To illustrate how PocoDynamo simplifies working with DynamoDB, we'll walk through creating and retrieving the simple [Todo model](https://github.com/ServiceStackApps/AwsApps/blob/04dea6472fd73ea2e55f1aa748fff6e8784b339c/src/AwsApps/todo/TodoService.cs#L9) used in the DynamoDB-powered AWS Todo Example and compare it against the code required when using AWSSDK's `IAmazonDynamoDB` client directly.

The simple `Todo` POCO is the same data model used to store TODOs in every major RDBMS with [OrmLite](/ormlite/), in Redis with [ServiceStack.Redis](/redis/) as well as every supported [Caching provider](/caching). PocoDynamo increases the re-use of `Todo` again, which can now be used to store TODOs in DynamoDB as well:

```csharp
public class Todo
{
    [AutoIncrement]
    public long Id { get; set; }
    public string Content { get; set; }
    public int Order { get; set; }
    public bool Done { get; set; }
}
```

### Creating a Table with PocoDynamo

PocoDynamo enables a declarative code-first approach where it's able to create DynamoDB Table schemas from just your POCO class definition. Whilst you could call the `db.CreateTable<Todo>()` API and create the Table directly, the recommended approach is instead to register all the tables your App uses with PocoDynamo on Startup, then just call `InitSchema()` which will go through and create all missing tables:

```csharp
//PocoDynamo
var db = new PocoDynamo(awsDb)
    .RegisterTable<Todo>();

db.InitSchema();

db.GetTableNames().PrintDump();
```

In this way your App ends up in the same state with all tables created whether it was started with **no tables**, **all tables** or only a **partial list** of tables.
After the tables are created we query DynamoDB to dump its entire list of Tables, which if you started with an empty DynamoDB instance would print the single **Todo** table name to the Console:

```js
[
    Todo
]
```

### Complete PocoDynamo TODO example

Before going through the details of how it all works under-the-hood, here's a quick overview of what it looks like to use PocoDynamo for developing a simple CRUD App.

The ServiceStack [TodoService](https://github.com/ServiceStackApps/AwsApps/blob/master/src/AwsApps/todo/TodoService.cs) below contains the full server implementation required to implement the REST API to power [Backbone's famous TODO App](http://todomvc.com/examples/backbone/), rewritten to store all TODO items in DynamoDB:

```csharp
//PocoDynamo
public class TodoService : Service
{
    public IPocoDynamo Dynamo { get; set; }

    public object Get(Todo todo)
    {
        if (todo.Id != default(long))
            return Dynamo.GetItem<Todo>(todo.Id);

        return Dynamo.GetAll<Todo>();
    }

    public Todo Post(Todo todo)
    {
        Dynamo.PutItem(todo);
        return todo;
    }

    public Todo Put(Todo todo)
    {
        return Post(todo);
    }

    public void Delete(Todo todo)
    {
        Dynamo.DeleteItem<Todo>(todo.Id);
    }
}
```

We can see `IPocoDynamo` is just a normal IOC dependency that provides high-level APIs that work directly with POCOs and built-in .NET data types, enabling the minimum effort to store, get and delete data from DynamoDB.
### Creating a DynamoDB Table using AmazonDynamoDBClient

The equivalent imperative code to create the Todo DynamoDB table above requires creating and executing the `CreateTableRequest` below:

```csharp
//AWSSDK
var request = new CreateTableRequest
{
    TableName = "Todo",
    KeySchema = new List<KeySchemaElement>
    {
        new KeySchemaElement("Id", KeyType.HASH),
    },
    AttributeDefinitions = new List<AttributeDefinition>
    {
        new AttributeDefinition("Id", ScalarAttributeType.N),
    },
    ProvisionedThroughput = new ProvisionedThroughput
    {
        ReadCapacityUnits = 10,
        WriteCapacityUnits = 5,
    }
};
awsDb.CreateTable(request);
```

DynamoDB Tables take a little while to create in AWS so we can't use them immediately, instead you'll need to periodically poll to check the status for when it's ready:

```csharp
//AWSSDK
var startAt = DateTime.UtcNow;
var timeout = TimeSpan.FromSeconds(60);

do
{
    try
    {
        var descResponse = awsDb.DescribeTable("Todo");
        if (descResponse.Table.TableStatus == DynamoStatus.Active)
            break;

        Thread.Sleep(TimeSpan.FromSeconds(2));
    }
    catch (ResourceNotFoundException)
    {
        // DescribeTable is eventually consistent. So you might get resource not found.
    }

    if (DateTime.UtcNow - startAt > timeout)
        throw new TimeoutException("Exceeded timeout of {0}".Fmt(timeout));
} while (true);
```

Once the table is Active we can start using it. To get the list of table names we send a `ListTablesRequest`:

```csharp
//AWSSDK
var listResponse = awsDb.ListTables(new ListTablesRequest());
var tableNames = listResponse.TableNames;

tableNames.PrintDump();
```

## Managed DynamoDB Client

As we can see, using the `AmazonDynamoDBClient` directly requires a lot more imperative code, but it also ends up doing a lot less. We've not included the logic to query existing tables so only the missing tables are created, we've not implemented any error handling or Retry logic (important for Cloud Services) and we're not checking to make sure we've collected the entire list of results (implementing paging when necessary).
Whereas every request in PocoDynamo is invoked inside a managed execution where any temporary errors are retried using the [AWS recommended retries exponential backoff](http://docs.aws.amazon.com/amazondynamodb/latest/developerguide/ErrorHandling.html#APIRetries).

All PocoDynamo APIs returning `IEnumerable<T>` return a lazily evaluated stream which behind-the-scenes sends multiple paged requests as needed whilst the sequence is being iterated. As LINQ APIs are also lazily evaluated you could use `Take()` to only download the exact number of results you need, so you can query the first 100 table names with:

```csharp
//PocoDynamo
var first100TableNames = db.GetTableNames().Take(100).ToList();
```

and PocoDynamo will only make the minimum number of requests required to fetch the first 100 results.

## AutoIncrement Primary Keys

Once the `Todo` table is created we can start adding TODOs to it. If we were using OrmLite, the `[AutoIncrement]` attribute lets us use the RDBMS's native support for auto incrementing sequences to populate the Id primary key.

Unfortunately DynamoDB lacks an auto increment feature and instead recommends users supply a unique key, as shown in their [DynamoDB Forum example](http://docs.aws.amazon.com/amazondynamodb/latest/developerguide/DataModel.html) where they've chosen a Forum Name as the Hash Key of the Forum and Thread tables, whilst the Reply comment uses a concatenation of `ForumName` + `#` + `ThreadSubject` as its Hash Key and the `ReplyDateTime` for the Range Key.
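For comparison, the concatenated-key approach from the DynamoDB Forum example could be sketched as a POCO with explicit key attributes. This is an illustrative sketch only: `[HashKey]` appears in the Global Index examples later on this page, whilst the `[RangeKey]` attribute and the `Reply` model are assumptions:

```csharp
public class Reply
{
    // Concatenation of "ForumName#ThreadSubject" used as the Hash Key
    [HashKey]
    public string ThreadId { get; set; }

    // ReplyDateTime used as the Range Key to order replies within a thread
    [RangeKey]
    public DateTime ReplyDateTime { get; set; }

    public string Message { get; set; }
}
```

Note how the Hash Key embeds data it references, one of the drawbacks auto incrementing Ids avoid.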
However auto incrementing Ids have a number of useful properties making them ideal for identifying data:

- **Unique** - Each new item is guaranteed to have a unique Id that's higher than all Ids before it
- **Sequential** - A useful property to ensure consistent results when paging or ordering
- **Never change** - To ensure a constant key that never changes, Ids shouldn't contain data they reference
- **Easy to read** - Humans have a better chance to read and remember a number than a concatenated string
- **Easy to reference** - It's easier to reference a predictable numeric field than a concatenated string

They're also more re-usable as most data stores have native support for integer primary keys.

For these reasons we've added support for Auto-Incrementing integer primary keys in PocoDynamo, where Ids annotated with the `[AutoIncrement]` attribute are automatically populated with the next id in their sequence.

#### ISequenceSource

The Auto Incrementing functionality is provided by the [ISequenceSource](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/ISequenceSource.cs) interface:

```csharp
public interface ISequenceSource : IRequiresSchema
{
    long Increment(string key, int amount = 1);
    void Reset(string key, int startingAt = 0);
}
```

#### DynamoDbSequenceGenerator

The default implementation uses [DynamoDbSequenceGenerator](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack.Aws/src/ServiceStack.Aws/DynamoDb/DynamoDbSequenceGenerator.cs) which stores sequences for each table in the `Seq` DynamoDB Table so no additional services are required. To ensure unique incrementing sequences in DynamoDB, PocoDynamo uses UpdateItemRequest's `AttributeValueUpdate` feature to perform atomic value updates.

PocoDynamo sequences are also very efficient and only require a single DynamoDB call to populate a batch of Primary Key Ids, which are also guaranteed to be in order (and without gaps) for batches that are stored together.
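Sequences can also be used directly from the client, e.g. to reserve a block of Ids up-front. This is a hedged sketch assuming the client's `Sequences` property (shown in the RedisSequenceSource configuration below) exposes the registered `ISequenceSource`, and that table sequences are keyed by table name as in the `Seq` table example:

```csharp
// Reserve the next Id in the Todo table's sequence
var nextId = db.Sequences.Increment("Todo");

// Reserve a block of 10 Ids with a single atomic DynamoDB call;
// the returned value is the upper bound of the reserved range
var upperBound = db.Sequences.Increment("Todo", 10);
var firstReservedId = upperBound - 10 + 1;
```

This single-call batch reservation is what makes populating Ids for `PutItems()` efficient.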
#### RedisSequenceSource

If preferred you can instead instruct PocoDynamo to maintain sequences in Redis using [RedisSequenceSource](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Server/RedisSequenceSource.cs), or alternatively inject your own implementation, which can be configured in PocoDynamo with:

```csharp
var db = new PocoDynamo(awsDb)
{
    Sequences = new RedisSequenceSource(redisManager),
};
```

## Putting items with PocoDynamo

As we can take advantage of Auto Incrementing Ids, storing Items becomes as simple as creating a number of POCOs and calling `PutItems`:

```csharp
//PocoDynamo
var todos = 100.Times(i => new Todo { Content = "TODO " + i, Order = i });
db.PutItems(todos);
```

## Putting items with AmazonDynamoDBClient

To do this manually with `AmazonDynamoDBClient` you'd need to create an `UpdateItemRequest` to update the counter maintaining your TODO sequences:

```csharp
//AWSSDK
var incrRequest = new UpdateItemRequest
{
    TableName = "Seq",
    Key = new Dictionary<string, AttributeValue> {
        { "Id", new AttributeValue { S = "Todo" } }
    },
    AttributeUpdates = new Dictionary<string, AttributeValueUpdate> {
        {
            "Counter",
            new AttributeValueUpdate {
                Action = AttributeAction.ADD,
                Value = new AttributeValue { N = "100" }
            }
        }
    },
    ReturnValues = ReturnValue.ALL_NEW,
};

var response = awsDb.UpdateItem(incrRequest);
var nextSequences = Convert.ToInt64(response.Attributes["Counter"].N);
```

After you know which sequence to start with you can start putting items using a Dictionary of Attribute Values:

```csharp
//AWSSDK
for (int i = 0; i < 100; i++)
{
    var putRequest = new PutItemRequest("Todo",
        new Dictionary<string, AttributeValue> {
            { "Id", new AttributeValue { N = (nextSequences - 100 + i).ToString() } },
            { "Content", new AttributeValue("TODO " + i) },
            { "Order", new AttributeValue { N = i.ToString() } },
            { "Done", new AttributeValue { BOOL = false } },
        });
    awsDb.PutItem(putRequest);
}
```

Although even without the managed execution this still
isn't equivalent to PocoDynamo's example above, as to store multiple items efficiently PocoDynamo's `PutItems()` API batches multiple Items in 4x `BatchWriteItemRequest`s behind-the-scenes - the minimum number needed due to DynamoDB's maximum Write Batch size limit of 25 requests.

## Getting Items with PocoDynamo

Getting an item just requires the Generic Type and the primary key of the item to fetch:

```csharp
var todo = db.GetItem<Todo>(1);
todo.PrintDump();
```

Which returns the Todo item if it exists, or `null` if it doesn't.

Fetching all table items is where an understanding of DynamoDB's architecture and its limits becomes important. DynamoDB achieves its scalability by partitioning your data across multiple partitions based on its Hash Key (aka Primary Key). This means that the only way to efficiently query across data containing multiple primary keys is to either explicitly create a [Global Secondary Index](http://docs.aws.amazon.com/amazondynamodb/latest/developerguide/GSI.html) or perform a full-table Scan.

However table scans in DynamoDB are less efficient than full table scans in RDBMSs since they have to scan across multiple partitions, which can quickly use up your table's provisioned throughput, as such scans should be limited to low usage areas.

With that said, you can do Table Scans in PocoDynamo using APIs starting with the `Scan*` prefix, e.g. to return all Todo items:

```csharp
//PocoDynamo
IEnumerable<Todo> todos = db.ScanAll<Todo>();
```

As IEnumerables are lazily evaluated, it only starts sending `ScanRequest`s to fetch all Items once the IEnumerable is iterated, which it does in **batches of 1000** (configurable with `PocoDynamo.PagingLimit`). To fetch all items you can just call `ToList()`:

```csharp
var allTodos = todos.ToList();
allTodos.PrintDump();
```

Which incidentally is also just what `db.GetAll<Todo>()` does.
## Getting Items with AWSSDK

To fetch the same single item with the AWSSDK client you'd construct and send a `GetItemRequest`, e.g:

```csharp
//AWSSDK
var request = new GetItemRequest
{
    TableName = "Todo",
    Key = new Dictionary<string, AttributeValue> {
        { "Id", new AttributeValue { N = "1" } }
    },
    ConsistentRead = true,
};
var response = awsDb.GetItem(request);

var todo = new Todo
{
    Id = Convert.ToInt64(response.Item["Id"].N),
    Content = response.Item["Content"].S,
    Order = Convert.ToInt32(response.Item["Order"].N),
    Done = response.Item["Done"].BOOL,
};
```

Although this is a little fragile as it doesn't handle the cases where attributes (aka Properties) or the item itself don't exist.

Doing a full-table scan is pretty straightforward, although as you're scanning the entire table you'll want to implement the paging to scan through all items, which looks like:

```csharp
//AWSSDK
var request = new ScanRequest
{
    TableName = "Todo",
    Limit = 1000,
};

var allTodos = new List<Todo>();
ScanResponse response = null;
do
{
    if (response != null)
        request.ExclusiveStartKey = response.LastEvaluatedKey;

    response = awsDb.Scan(request);

    foreach (var item in response.Items)
    {
        var todo = new Todo
        {
            Id = Convert.ToInt64(item["Id"].N),
            Content = item["Content"].S,
            Order = Convert.ToInt32(item["Order"].N),
            Done = item["Done"].BOOL,
        };
        allTodos.Add(todo);
    }
} while (response.LastEvaluatedKey != null && response.LastEvaluatedKey.Count > 0);

allTodos.PrintDump();
```

## Deleting an Item with PocoDynamo

Deleting an item is similar to getting an item, which just needs the generic type and primary key:

```csharp
//PocoDynamo
db.DeleteItem<Todo>(1);
```

## Deleting an Item with AWSSDK

Which just sends a `DeleteItemRequest` to delete the Item:

```csharp
//AWSSDK
var request = new DeleteItemRequest
{
    TableName = "Todo",
    Key = new Dictionary<string, AttributeValue> {
        { "Id", new AttributeValue { N = "1" } }
    },
};
awsDb.DeleteItem(request);
```

## Updating an Item with PocoDynamo

The simplest usage is to pass in
a partially populated POCO where any Hash or Range Keys are added to the Key Condition and any non-default values are replaced. E.g. the query below updates the Customer's Age to **42**:

```csharp
db.UpdateItemNonDefaults(new Customer { Id = customer.Id, Age = 42 });
```

DynamoDB's UpdateItem supports 3 different operation types:

- `PUT` to replace an Attribute Value
- `ADD` to add to an existing Attribute Value
- `DELETE` to delete the specified Attributes

All 3 are demonstrated in the example below, which changes the Customer's `Nationality` to **Australian**, reduces their `Age` by **1** and deletes their `Name` and `Orders`:

```csharp
db.UpdateItem(customer.Id,
    put: () => new Customer { Nationality = "Australian" },
    add: () => new Customer { Age = -1 },
    delete: x => new { x.Name, x.Orders });
```

The same Typed API above is also available in the more flexible and untyped form below:

```csharp
db.UpdateItem<Customer>(new DynamoUpdateItem
{
    Hash = customer.Id,
    Put = new Dictionary<string, object>
    {
        { "Nationality", "Australian" },
    },
    Add = new Dictionary<string, object>
    {
        { "Age", -1 }
    },
    Delete = new[] { "Name", "Orders" },
});
```

### Update with Conditional Expressions

PocoDynamo also has Typed API support for [DynamoDB Conditional Expressions](http://docs.aws.amazon.com/amazondynamodb/latest/APIReference/API_PutItem.html#API_PutItem_RequestSyntax) by using the `Condition()` API, e.g:

```csharp
var q = db.UpdateExpression<Customer>(customer.Id)
    .Set(() => new Customer { Nationality = "Australian" })
    .Add(() => new Customer { Age = decrBy })
    .Remove(x => new { x.Name, x.Orders })
    .Condition(x => x.Age == 27);

var succeeded = db.UpdateItem(q);
```

## Querying

The simple Todo example should give you a feel for using PocoDynamo to handle basic CRUD operations.
Another area where PocoDynamo adds a lot of value, which can be fairly cumbersome to do without, is in creating [Query and Scan](http://docs.aws.amazon.com/amazondynamodb/latest/developerguide/QueryAndScan.html) requests to query data in DynamoDB Tables.

### QueryExpressions are QueryRequests

The query functionality in PocoDynamo is available on the `QueryExpression<T>` class which is used as a typed query builder to construct your Query request. An important attribute of QueryExpressions is that they inherit from AWSSDK's [QueryRequest](http://docs.aws.amazon.com/amazondynamodb/latest/developerguide/LowLevelDotNetQuerying.html) Request DTO. This provides a number of benefits: they're easy to use and highly introspectable since each API just populates different fields in the Request DTO. They're also highly reusable as QueryExpressions can be executed as-is in the AWSSDK DynamoDB client and vice-versa, with PocoDynamo's `Query` APIs executing both `QueryExpression<T>` and `QueryRequest` DTOs.

The difference with PocoDynamo's Query APIs is that they provide managed execution, lazy evaluation, paged queries and auto-conversion of dynamic results into typed POCOs.

### Query Usage

[DynamoDB Queries](http://docs.aws.amazon.com/amazondynamodb/latest/APIReference/API_Query.html) enable efficient querying of data in DynamoDB as they're limited to querying the indexed Hash and Range Keys on your Tables or Table Indexes. Although they have the major limitation of always needing to specify a Hash condition, essentially forcing the query to be scoped to a single partition. This makes it fairly useless for Tables with only a single Hash Primary Key like `Todo`, as the query condition will always limit to a maximum of 1 result. Nevertheless we can still use it to show how to perform server-side queries with PocoDynamo.

To create a QueryExpression use the `FromQuery*` APIs.
It accepts a `KeyConditionExpression` as the first argument, given it's a mandatory requirement for Query Requests, which use it to identify the partition the query should be executed on:

```csharp
var q = db.FromQuery<Todo>(x => x.Id == 1);
```

#### Key Condition and Filter Expressions

PocoDynamo parses this lambda expression to return a populated `QueryExpression<Todo>` which you can inspect to find the `TableName` set to **Todo** and the `KeyConditionExpression` set to **(Id = :k0)** with the `ExpressionAttributeValues` Dictionary containing a Numeric value of **1** for the key **:k0**.

From here you can continue constructing the QueryRequest DTO by populating its properties directly or by calling `QueryExpression`'s high-level methods (modeled after the properties they populate), e.g. the `KeyCondition()` method populates the `KeyConditionExpression` property, `Filter()` populates the `FilterExpression` property, and any arguments used in any expression are automatically parameterized and added to the `ExpressionAttributeValues` collection:

```csharp
var q = db.FromQuery<Todo>()
    .KeyCondition(x => x.Id == 1) //Equivalent to: db.FromQuery<Todo>(x => x.Id == 1)
    .Filter(x => x.Done);

q.TableName                 // Todo
q.KeyConditionExpression    // (Id = :k0)
q.FilterExpression          // Done = :true
q.ExpressionAttributeValues // :k0 = AttributeValue {N=1}, :true = AttributeValue {BOOL=true}
```

Filter expressions are applied after the query is executed, which enables more flexible querying as they're not just limited to key fields and can be used to query any field to further filter the returned resultset.

### Executing Queries

After you've finished populating the Request DTO it can be executed with PocoDynamo's `Query()`. This returns a lazily evaluated resultset which you can use LINQ methods on to fetch the results.
Given the primary key condition, we know this will only return 0 or 1 rows based on whether or not the TODO has been completed, which we can check by calling LINQ's `FirstOrDefault()` method:

```csharp
var todo1Done = db.Query(q).FirstOrDefault();
```

Where `todo1Done` will hold the populated `Todo` if it was marked done, otherwise it will be `null`.

#### Expression Chaining

Most `QueryExpression` methods return the expression itself, and as an alternative to calling `Query` on PocoDynamo (or the AWSSDK) to execute the Query, you can instead call the `Exec()` alias. This allows you to create and execute your DynamoDb Query in a single expression, which can instead be rewritten as:

```csharp
var todo1Done = db.FromQuery<Todo>(x => x.Id == 1)
    .Filter(x => x.Done)
    .Exec()
    .FirstOrDefault();
```

### Scan Operations

Scan Operations work very similarly to Query Operations, but instead of using a `QueryExpression<T>` you would instead use a `ScanExpression<T>` which, as it inherits from AWSSDK's `ScanRequest` Request DTO, provides the same reuse benefits as QueryExpressions.

To create a Scan Request you would use the `FromScan<T>` API, e.g:

```csharp
var q = db.FromScan<Todo>();
```

More examples of how to use typed LINQ expressions for creating and executing Query and Scan requests are described later.

### Related Items

DynamoDB Queries are ideally suited for when the dataset is naturally isolated, e.g. multi-tenant Apps that are centered around Customer data, so any related records are able to share the same `CustomerId` Hash Key.

PocoDynamo has good support for maintaining related data which can re-use the same Data Annotations used to define POCO relationships in OrmLite, often letting you reuse your existing OrmLite RDBMS data models in DynamoDB as well.
To illustrate how to use PocoDynamo to maintain related data we'll walk through a typical Customer and Orders example:

```csharp
public class Customer
{
    [AutoIncrement]
    public int Id { get; set; }

    public string Name { get; set; }

    public CustomerAddress PrimaryAddress { get; set; }
}

public class CustomerAddress
{
    [AutoIncrement]
    public int Id { get; set; }

    public string Address { get; set; }
    public string State { get; set; }
    public string Country { get; set; }
}

[Alias("CustomerOrder")]
public class Order
{
    [AutoIncrement]
    public int Id { get; set; }

    [References(typeof(Customer))]
    public int CustomerId { get; set; }

    public string Product { get; set; }
    public int Qty { get; set; }

    [Index]
    public virtual decimal Cost { get; set; }
}
```

In order to use them we need to tell PocoDynamo which of the Types are Tables it should create in DynamoDB, which we can do by registering them with PocoDynamo then calling `InitSchema()` which will go through and create any of the tables that don't yet exist in DynamoDB:

```csharp
db = new PocoDynamo(awsDb)
    .RegisterTable<Customer>()
    .RegisterTable<Order>();

db.InitSchema();
```

`InitSchema()` will also wait until the tables have been created so they're immediately accessible afterwards. As creating DynamoDB tables can take upwards of a minute in AWS, you can use the [alternative Async APIs](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack.Aws/src/ServiceStack.Aws/DynamoDb/IPocoDynamoAsync.cs) if you wanted to continue doing other work whilst the tables are being created in AWS, e.g:

```csharp
var task = db.InitSchemaAsync();
// do other stuff...
await task;
```

## Related Data

After the tables are created we can insert the top-level Customer record as normal:

```csharp
var customer = new Customer
{
    Name = "Customer",
    PrimaryAddress = new CustomerAddress
    {
        Address = "1 road",
        State = "NT",
        Country = "Australia",
    }
};

db.PutItem(customer);
```

Before adding the record, PocoDynamo also populates any `[AutoIncrement]` properties with the next number in the sequence for that Type. Any complex types stored on the `Customer` POCO like `CustomerAddress` get persisted along with the containing `Customer` entry, converted into a **Map** of DynamoDB Attribute Value pairs. We can view the [DynamoDB Web Console](http://docs.aws.amazon.com/amazondynamodb/latest/developerguide/ConsoleDynamoDB.html) to see how this is stored in DynamoDB:

![](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/aws/pocodynamo/related-customer.png)

### Related Tables

You can define a related table using the `[References]` attribute to tell PocoDynamo what the parent table is, e.g:

```csharp
[Alias("CustomerOrder")]
public class Order
{
    [AutoIncrement]
    public int Id { get; set; }

    [References(typeof(Customer))]
    public int CustomerId { get; set; }
    //...
}
```

From this PocoDynamo infers it should create the table using the parent's `CustomerId` as its Hash Key, relegating its `Id` to the Range Key for the table. This ensures the Order is kept in the same partition as all other related Customer Data, necessary in order to efficiently query a Customer's Orders. When both the Hash and Range Key are defined they're treated as the Composite Key for that table, which needs to be unique for each item - guaranteed when using `[AutoIncrement]` Ids.

#### Inserting Related Data

After the table is created we can generate and insert random orders like any other table, e.g:

```csharp
var orders = 10.Times(i => new Order
{
    CustomerId = customer.Id,
    Product = "Item " + (i % 2 == 0 ?
"A" : "B"), Qty = i + 2, Cost = (i + 2) * 2 }); db.PutItems(orders); ``` You can also use the alternative `PutRelatedItems()` API and get PocoDynamo to take care of populating the `CustomerId`: ```csharp var orders = 10.Times(i => new Order { Product = "Item " + (i % 2 == 0 ? "A" : "B"), Qty = i + 2, Cost = (i + 2) * 2 }); db.PutRelatedItems(customer.Id, orders); ``` Both examples results in the same data being inserted into the **CustomerOrder** DynamoDB table: ![](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/aws/pocodynamo/related-customer-orders.png) This also shows how the `[Alias]` attribute can be used to rename the `Order` Type as **CustomerOrder** in DynamoDB. ### Querying Related Tables Now we have related data we can start querying it, something you may want to do is fetch all Customer Orders: ```csharp var q = db.FromQuery<Order>(x => x.CustomerId == customer.Id); var dbOrders = db.Query(q); ``` As getting related Items for a Hash Key is a popular query, it has an explicit API: ```csharp var dbOrders = db.GetRelatedItems<Order>(customer.Id); ``` We can refine the query further by specifying a `FilterExpression` to limit the results DynamoDB returns: ```csharp var expensiveOrders = q.Clone() .Filter(x => x.Cost > 10) .Exec(); ``` > Using `Clone()` will create and modify a copy of the query, leaving the original one intact. ### [Local Secondary Indexes](http://docs.aws.amazon.com/amazondynamodb/latest/developerguide/LSI.html) But filters aren't performed on an Index and can be inefficient if your table has millions of customer rows. By default only the Hash and Range Key are indexed, in order to efficiently query any other field you will need to create a [Local Secondary Index](http://docs.aws.amazon.com/amazondynamodb/latest/developerguide/LSI.html) for it. This is easily done in PocoDynamo by annotating the properties you want indexed with the `[Index]` attribute: ```csharp public class Order { //... 
    [Index]
    public decimal Cost { get; set; }
}
```

Which tells PocoDynamo to create a Local Secondary Index for the `Cost` property when it creates the table. When one exists, you can query a Local Index with `LocalIndex()`:

```csharp
var expensiveOrders = q
    .LocalIndex(x => x.Cost > 10)
    .Exec();
```

Which now performs the Cost query on an index, although this only returns partially populated Orders, specifically with just the Hash Key (CustomerId), Range Key (Id) and the field that's indexed (Cost):

```cs
expensiveOrders.PrintDump();
```

```js
[
    {
        Id: 5,
        CustomerId: 1,
        Qty: 0,
        Cost: 12
    },
    //...
]
```

This is due to [Local Secondary Indexes](http://docs.aws.amazon.com/amazondynamodb/latest/developerguide/LSI.html) being just denormalized tables behind the scenes, which by default only return the projected fields that were defined when the Index was created.

One way to return populated orders is to specify a custom `ProjectionExpression` with the fields you want returned. E.g. you can create a request with a populated `ProjectionExpression` that returns all Order fields with:

```csharp
var expensiveOrders = q
    .LocalIndex(x => x.Cost > 10)
    .Select<Order>() //Equivalent to: SelectTableFields()
    .Exec();
```

Which now returns:

```cs
expensiveOrders.PrintDump();
```

```js
[
    {
        Id: 5,
        CustomerId: 1,
        Product: Item A,
        Qty: 6,
        Cost: 12
    },
    //...
]
```

### Typed Local Indexes

Using a custom `ProjectionExpression` is an easy work-around, although for it to work DynamoDB needs to consult the primary table to fetch the missing fields for each item. For large tables that are frequently accessed, the query can be made more efficient by projecting the fields you want returned when the Index is created.
You can tell PocoDynamo which additional fields it should project by creating a **Typed Local Index**, which is just a POCO implementing `ILocalIndex<T>` containing all the fields the index should contain, e.g:

```csharp
public class OrderCostLocalIndex : ILocalIndex<Order>
{
    [Index]
    public decimal Cost { get; set; }

    public int CustomerId { get; set; }
    public int Id { get; set; }
    public int Qty { get; set; }
}

[References(typeof(OrderCostLocalIndex))]
public class Order { ... }
```

Then use the `[References]` attribute to register the Typed Index so PocoDynamo knows which additional indexes need to be created with the table. The `[Index]` attribute is used to specify which field is indexed (Range Key) whilst the `CustomerId` is automatically used as the Hash Key for the Local Index Table.

#### Querying Typed Indexes

To query a typed Index, use `FromQueryIndex<T>()` which returns a populated Query Request with the Table and Index Name. As `Cost` is now the Range Key of the Local Index table it can be queried together with the `CustomerId` Hash Key in the Key Condition expression:

```csharp
List<OrderCostLocalIndex> expensiveOrderIndexes = db.FromQueryIndex<OrderCostLocalIndex>(
        x => x.CustomerId == customer.Id && x.Cost > 10)
    .Exec();
```

This returns a list of populated indexes that now includes the `Qty` field:

```cs
expensiveOrderIndexes.PrintDump();
```

```js
[
    { Cost: 12, CustomerId: 1, Id: 5, Qty: 6 },
    //...
]
```

If preferred, you can easily convert the Typed Indexes into Orders using ServiceStack's [built-in Auto-Mapping](/auto-mapping), e.g:

```csharp
List<Order> expensiveOrders = expensiveOrderIndexes
    .Map(x => x.ConvertTo<Order>());
```

### [Global Secondary Indexes](http://docs.aws.amazon.com/amazondynamodb/latest/developerguide/GSI.html)

The major limitation of Local Indexes is that they're limited to querying data in the same partition (Hash Key).
To efficiently query an index spanning the entire dataset, you need to instead use a [Global Secondary Index](http://docs.aws.amazon.com/amazondynamodb/latest/developerguide/GSI.html).

Support for Global Indexes in PocoDynamo is similar to Typed Local Indexes, except they instead implement `IGlobalIndex<T>`. They also free you to choose a new Hash Key, letting you create an Index spanning all Customers. For example, we can create a global index that lets us search Cost across all Orders containing a particular Product:

```csharp
public class OrderCostGlobalIndex : IGlobalIndex<Order>
{
    [HashKey]
    public string Product { get; set; }

    [Index]
    public decimal Cost { get; set; }

    public int CustomerId { get; set; }
    public int Qty { get; set; }
    public int Id { get; set; }
}

[References(typeof(OrderCostGlobalIndex))]
public class Order { ... }
```

Our Key Condition can now instead query Product and Cost fields across all Customer Orders:

```csharp
var expensiveItemAOrders = db.FromQueryIndex<OrderCostGlobalIndex>(
        x => x.Product == "Item A" && x.Cost > 10)
    .Exec();
```

Which will print all **Item A** Orders with a **Cost > 10**:

```cs
expensiveItemAOrders.PrintDump();
```

```js
[
    { Product: Item A, Cost: 12, CustomerId: 1, Qty: 6, Id: 5 },
    //...
]
```

## Scan Requests

You'll want to use queries for any frequently accessed code running in production, although the full querying flexibility available in full table Scan requests can be useful for ad hoc querying and for speeding up development cycles: start out with Scan queries, then once the data requirements for your App have been finalized, rewrite them to use indexes and queries.
To create Scan Requests you instead call the `FromScan*` API's, e.g:

```csharp
var allOrders = db.ScanAll<Order>();

var expensiveOrders = db.FromScan<Order>(x => x.Cost > 10)
    .Exec();
```

You can also perform scans on Global Indexes, but unlike queries they don't need to be limited to the Hash Key:

```csharp
var expensiveOrderIndexes = db
    .FromScanIndex<OrderCostGlobalIndex>(x => x.Cost > 10)
    .Exec();
```

Just like `QueryExpression<T>`, the populated `ScanExpression<T>` inherits from AWSSDK's `ScanRequest`, enabling the same re-use benefits for `ScanRequest` as `QueryExpression<T>` provides for `QueryRequest`.

## Query and Scan Expressions

Both Scan and Query expressions benefit from a Typed LINQ-like expression API which can be used to populate the request DTO's:

- **KeyConditionExpression** - for specifying conditions on the table's Hash and Range Keys (QueryRequest only)
- **FilterExpression** - for specifying conditions to filter results on other fields
- **ProjectionExpression** - to specify any custom fields (default: all fields)

Each `QueryRequest` needs to provide a key condition which can be done when creating the QueryExpression:

```csharp
var orders = db.FromQuery<Order>(x => x.CustomerId == 1).Exec();

// Alternative explicit API
var expensiveOrders = db.FromQuery<Order>().KeyCondition(x => x.CustomerId == 1).Exec();
```

Whilst every condition on a `ScanRequest` is added to the FilterExpression:

```csharp
var expensiveOrders = db.FromScan<Order>(x => x.Cost > 10).Exec();

// Alternative explicit API
var expensiveOrders = db.FromScan<Order>().Filter(x => x.Cost > 10).Exec();
```

Calling `Exec()` returns a lazily executed response which transparently sends multiple paged requests to fetch the results as needed, e.g. calling LINQ's `.FirstOrDefault()` only makes a single request whilst `.ToList()` fetches the entire resultset. All streaming `IEnumerable<T>` requests are sent with the configured `PagingLimit` (default: 1000).
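To make the lazy paging behavior concrete, here's a minimal sketch (assuming the `Order` table from the earlier examples) contrasting a short-circuiting enumeration with an exhaustive one:

```csharp
// Exec() returns a lazy IEnumerable<T> - no request is sent until it's enumerated
var lazyResults = db.FromScan<Order>(x => x.Cost > 10).Exec();

// Enumeration stops after the first paged response arrives
var firstMatch = lazyResults.FirstOrDefault();

// Enumeration keeps issuing paged requests until the resultset is exhausted
var allMatches = lazyResults.ToList();
```

As with any lazy sequence, enumerating the response more than once re-executes the underlying requests, so call `.ToList()` first if you need to iterate over the results multiple times.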
#### Custom Limits

Several PocoDynamo API's have overloads that let you specify a custom limit. API's with limits are instead executed immediately with the limit specified and returned in a concrete List:

```csharp
List<Order> expensiveOrders = db.FromScan<Order>().Filter(x => x.Cost > 10).Exec(limit:5);
```

### Custom Filter Expressions

There are also custom overloads that can be used to execute a custom expression when more flexibility is needed:

```csharp
// Querying by Custom Filter Condition with anon args
var expensiveOrders = db.FromScan<Order>().Filter("Cost > :amount", new { amount = 10 }).Exec();

// Querying by Custom Filter Condition with loose-typed Dictionary
var expensiveOrders = db.FromScan<Order>().Filter("Cost > :amount",
        new Dictionary<string, object> { { "amount", 10 } })
    .Exec();
```

### Custom Select Projections

By default queries return all fields defined on the POCO model. You can also customize the projected fields that are returned with the `Select*` and `Exec*` APIs:

```csharp
// Return partial fields from anon object
var partialOrders = db.FromScan<Order>().Select(x => new { x.CustomerId, x.Cost }).Exec();

// Return partial fields from array
var partialOrders = db.FromScan<Order>().Select(x => new[] { "CustomerId", "Cost" }).Exec();

// Return partial fields defined in a custom Poco
class CustomerCost
{
    public int CustomerId { get; set; }
    public virtual decimal Cost { get; set; }
}

var custCosts = db.FromScan<Order>().Select<CustomerCost>()
    .Exec()
    .Map(x => x.ConvertTo<CustomerCost>());

// Alternative shorter version of above
var custCosts = db.FromScan<Order>().ExecInto<CustomerCost>().ToList();

// Useful when querying an index and returning results in the primary Order Poco
List<Order> expensiveOrders = db.FromScanIndex<OrderCostGlobalIndex>(x => x.Cost > 10)
    .ExecInto<Order>();

// Return a single column of fields
List<int> orderIds = db.FromScan<Order>().ExecColumn(x => x.Id).ToList();
```

### Advanced LINQ Expressions

In addition to
basic predicate conditions, DynamoDB also includes support for [additional built-in functions](http://docs.aws.amazon.com/amazondynamodb/latest/developerguide/Expressions.SpecifyingConditions.html) which PocoDynamo also provides typed LINQ support for:

#### begins_with

Return items where string fields start with a particular substring:

```csharp
var orders = db.FromScan<Order>(x => x.Product.StartsWith("Item A")).Exec();

// Equivalent to
var orders = db.FromScan<Order>(x => Dynamo.BeginsWith(x.Product, "Item A")).Exec();

var orders = db.FromScan<Order>().Filter("begins_with(Product, :s)", new { s = "Item A" }).Exec();
```

#### contains

Return items where string fields contain a particular substring:

```csharp
var orders = db.FromScan<Order>(x => x.Product.Contains("em A")).Exec();

// Equivalent to
var orders = db.FromScan<Order>(x => Dynamo.Contains(x.Product, "em A")).Exec();

var orders = db.FromScan<Order>().Filter("contains(Product, :s)", new { s = "em A" }).Exec();
```

#### in

Returns items where fields exist in a particular collection:

```csharp
var qtys = new[] { 5, 10 };
var orders = db.FromScan<Order>(x => qtys.Contains(x.Qty)).Exec();

// Equivalent to
var orders = db.FromScan<Order>(x => Dynamo.In(x.Qty, qtys)).Exec();

var orders = db.FromScan<Order>().Filter("Qty in(:q1,:q2)", new { q1 = 5, q2 = 10 }).Exec();
```

#### size

Returns items where the string length equals a particular size:

```csharp
var orders = db.FromScan<Order>(x => x.Product.Length == 6).Exec();

// Equivalent to
var orders = db.FromScan<Order>(x => Dynamo.Size(x.Product) == 6).Exec();

var orders = db.FromScan<Order>().Filter("size(Product) = :n", new { n = 6 }).Exec();
```

Size also works for querying the size of different native DynamoDB collections, e.g:

```csharp
public class IntCollections
{
    public int Id { get; set; }
    public int[] ArrayInts { get; set; }
    public HashSet<int> SetInts { get; set; }
    public List<int> ListInts { get; set; }
    public Dictionary<int, int> DictionaryInts {
get; set; }
}

var results = db.FromScan<IntCollections>(x =>
        x.ArrayInts.Length == 10 &&
        x.SetInts.Count == 10 &&
        x.ListInts.Count == 10 &&
        x.DictionaryInts.Count == 10)
    .Exec();
```

#### between

Returns items where field values fall within a particular range (inclusive):

```csharp
var orders = db.FromScan<Order>(x => Dynamo.Between(x.Qty, 3, 5)).Exec();

// Equivalent to
var orders = db.FromScan<Order>(x => x.Qty >= 3 && x.Qty <= 5).Exec();

var orders = db.FromScan<Order>().Filter("Qty between :from and :to", new { from = 3, to = 5 }).Exec();
```

#### attribute_type

Return items where a field is of a particular type:

```csharp
var orders = db.FromScan<Order>(x =>
        Dynamo.AttributeType(x.Qty, DynamoType.Number) &&
        Dynamo.AttributeType(x.Product, DynamoType.String))
    .Exec();

// Equivalent to
var orders = db.FromScan<Order>().Filter(
        "attribute_type(Qty, :n) and attribute_type(Product, :s)",
        new { n = "N", s = "S"})
    .Exec();
```

Valid Types: L (List), M (Map), S (String), SS (StringSet), N (Number), NS (NumberSet), B (Binary), BS (BinarySet), BOOL (Boolean), NULL (Null)

#### attribute_exists

Return items where a particular field exists.
As the schema of your data models evolves you can use this to determine whether items are of an old or new schema:

```csharp
var newOrderTypes = db.FromScan<Order>(x => Dynamo.AttributeExists(x.NewlyAddedField)).Exec();

// Equivalent to
var newOrderTypes = db.FromScan<Order>().Filter("attribute_exists(NewlyAddedField)").Exec();
```

#### attribute_not_exists

Return items where a particular field does not exist:

```csharp
var oldOrderTypes = db.FromScan<Order>(x => Dynamo.AttributeNotExists(x.NewlyAddedField)).Exec();

// Equivalent to
var oldOrderTypes = db.FromScan<Order>().Filter("attribute_not_exists(NewlyAddedField)").Exec();
```

### Defaults and Custom Behavior

PocoDynamo is configured with the defaults below, which it uses throughout its various API's when creating and querying tables:

```csharp
//Defaults:
var db = new PocoDynamo(awsDb)
{
    PollTableStatus = TimeSpan.FromSeconds(2),
    MaxRetryOnExceptionTimeout = TimeSpan.FromSeconds(60),
    ReadCapacityUnits = 10,
    WriteCapacityUnits = 5,
    ConsistentRead = true,
    ScanIndexForward = true,
    PagingLimit = 1000,
};
```

If you want to query with different behavior you can create a clone of the client with the custom settings you want, e.g. you can create a client that performs eventually consistent queries with:

```csharp
IPocoDynamo eventuallyConsistentDb = db.ClientWith(consistentRead:false);
```

## Table definition

To support different coding styles, readability/dependency preferences and levels of data model reuse, PocoDynamo enables a wide array of options for specifying a table's Hash and Range Keys, in the following order of precedence.

**Note:** Hash and Range Keys cannot be read-only calculated properties.
### Specifying a Hash Key

Using the AWSSDK's `[DynamoDBHashKey]` attribute:

```csharp
public class Table
{
    [DynamoDBHashKey]
    public int CustomId { get; set; }
}
```

This requires your models to have a dependency on the **AWSSDK.DynamoDBv2** NuGet package, which can be avoided by using the **ServiceStack.Interfaces** `[HashKey]` attribute instead, which your models likely already have a reference to:

```csharp
public class Table
{
    [HashKey]
    public int CustomId { get; set; }
}
```

You can instead avoid attributes altogether by using the explicit **HashKey** naming convention:

```csharp
public class Table
{
    public int HashKey { get; set; }
}
```

For improved re-usability of your models you can instead use the generic annotations for defining a model's primary key:

```csharp
public class Table
{
    [PrimaryKey]
    public int CustomId { get; set; }
}
```

```csharp
public class Table
{
    [AutoIncrement]
    public int CustomId { get; set; }
}
```

Alternatively, using the universal `Id` naming convention:

```csharp
public class Table
{
    public int Id { get; set; }
}
```

If preferred, both Hash and Range Keys can be defined together with the class-level `[CompositeKey]` attribute:

```csharp
[CompositeKey("CustomHash", "CustomRange")]
public class Table
{
    public int CustomHash { get; set; }
    public int CustomRange { get; set; }
}
```

### Specifying a Range Key

For specifying the Range Key you can use the **AWSSDK.DynamoDBv2** Attribute:

```csharp
public class Table
{
    [DynamoDBRangeKey]
    public int CustomId { get; set; }
}
```

The **ServiceStack.Interfaces** attribute:

```csharp
public class Table
{
    [RangeKey]
    public int CustomId { get; set; }
}
```

Or without attributes, using the explicit `RangeKey` property name:

```csharp
public class Table
{
    public int RangeKey { get; set; }
}
```

## Examples

### [DynamoDbCacheClient](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack.Aws/src/ServiceStack.Aws/DynamoDb/DynamoDbCacheClient.cs)

We've been quick to benefit from the productivity advantages of
PocoDynamo ourselves, where we've used it to rewrite [DynamoDbCacheClient](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack.Aws/src/ServiceStack.Aws/DynamoDb/DynamoDbCacheClient.cs), which is now just 2/3 the size and much easier to maintain than the existing [Community-contributed version](https://github.com/ServiceStack/ServiceStack/blob/22aca105d39997a8ea4c9dc20b242f78e07f36e0/src/ServiceStack.Caching.AwsDynamoDb/DynamoDbCacheClient.cs), whilst at the same time extending it with even more functionality where it now implements the `ICacheClientExtended` API.

### [DynamoDbAuthRepository](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack.Aws/src/ServiceStack.Aws/DynamoDb/DynamoDbAuthRepository.cs)

PocoDynamo's code-first Typed API made it much easier to implement value-added DynamoDB functionality like the new [DynamoDbAuthRepository](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack.Aws/src/ServiceStack.Aws/DynamoDb/DynamoDbAuthRepository.cs) which, due to sharing a similar code-first POCO approach to OrmLite, ended up being a straight-forward port of the existing [OrmLiteAuthRepository](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Server/Auth/OrmLiteAuthRepository.cs) that was able to reuse the existing `UserAuth` and `UserAuthDetails` data models.

### [DynamoDbTests](https://github.com/ServiceStack/ServiceStack/tree/main/ServiceStack.Aws/tests/ServiceStack.Aws.DynamoDbTests)

Despite its young age, we've added a comprehensive test suite behind PocoDynamo, which has become our exclusive client for developing DynamoDB-powered Apps.
### AWS Apps

The [Live Demos](https://github.com/ServiceStackApps/LiveDemos) below were rewritten from their original RDBMS and OrmLite backends to utilize a completely managed AWS Stack that now uses PocoDynamo and a DynamoDB back-end:

![](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/aws/pocodynamo/examples-razor-rockstars.png)

![](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/aws/pocodynamo/examples-email-contacts.png)

![](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/aws/pocodynamo/examples-todos.png)

![](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/aws/pocodynamo/examples-awsauth.png)

## IPocoDynamo API

```csharp
// Interface for the code-first PocoDynamo client
public interface IPocoDynamo : IPocoDynamoAsync, IRequiresSchema
{
    // Get the underlying AWS DynamoDB low-level client
    IAmazonDynamoDB DynamoDb { get; }

    // Get the numeric unique Sequence generator configured with this client
    ISequenceSource Sequences { get; }

    // Access the converters that convert POCO's into DynamoDB data types
    DynamoConverters Converters { get; }

    // How long should PocoDynamo keep retrying failed operations in an exponential backoff (default 60s)
    TimeSpan MaxRetryOnExceptionTimeout { get; }

    // Get the AWSSDK DocumentModel schema for this Table
    Table GetTableSchema(Type table);

    // Get PocoDynamo Table metadata for this table
    DynamoMetadataType GetTableMetadata(Type table);

    // Calls 'ListTables' to return all Table Names in DynamoDB
    IEnumerable<string> GetTableNames();

    // Creates any tables missing in DynamoDB from the Tables registered with PocoDynamo
    bool CreateMissingTables(IEnumerable<DynamoMetadataType> tables, TimeSpan? timeout = null);

    // Creates any tables missing from the specified list of tables
    bool CreateTables(IEnumerable<DynamoMetadataType> tables, TimeSpan? timeout = null);

    // Deletes all DynamoDB Tables
    bool DeleteAllTables(TimeSpan?
timeout = null);

    // Deletes the tables in DynamoDB with the specified table names
    bool DeleteTables(IEnumerable<string> tableNames, TimeSpan? timeout = null);

    // Gets the POCO instance with the specified hash
    T GetItem<T>(object hash);

    // Gets the POCO instance with the specified hash and range value
    T GetItem<T>(object hash, object range);

    // Calls 'BatchGetItem' in the min number of batch requests to return POCOs with the specified hashes
    List<T> GetItems<T>(IEnumerable<object> hashes);

    // Calls 'PutItem' to store instance in DynamoDB
    T PutItem<T>(T value, bool returnOld = false);

    // Calls 'BatchWriteItem' to efficiently store items in min number of batched requests
    void PutItems<T>(IEnumerable<T> items);

    // Deletes the instance at the specified hash
    T DeleteItem<T>(object hash, ReturnItem returnItem = ReturnItem.None);

    // Calls 'BatchWriteItem' to efficiently delete all items with the specified hashes
    void DeleteItems<T>(IEnumerable<object> hashes);

    // Calls 'BatchWriteItem' to efficiently delete all items with the specified hash and range pairs
    void DeleteItems<T>(IEnumerable<DynamoId> hashes);

    // Calls 'UpdateItem' with ADD AttributeUpdate to atomically increment specific field numeric value
    long Increment<T>(object hash, string fieldName, long amount = 1);

    // Polls 'DescribeTable' until all Tables have an ACTIVE TableStatus
    bool WaitForTablesToBeReady(IEnumerable<string> tableNames, TimeSpan? timeout = null);

    // Polls 'ListTables' until all specified tables have been deleted
    bool WaitForTablesToBeDeleted(IEnumerable<string> tableNames, TimeSpan?
timeout = null);

    // Updates item Hash field with hash value then calls 'PutItem' to store the related instance
    void PutRelatedItem<T>(object hash, T item);

    // Updates all item Hash fields with hash value then calls 'PutItems' to store all related instances
    void PutRelatedItems<T>(object hash, IEnumerable<T> items);

    // Calls 'Query' to return all related Items containing the specified hash value
    IEnumerable<T> GetRelatedItems<T>(object hash);

    // Deletes all items with the specified hash and ranges
    void DeleteRelatedItems<T>(object hash, IEnumerable<object> ranges);

    // Calls 'Scan' to return lazy enumerated results that's transparently paged across multiple queries
    IEnumerable<T> ScanAll<T>();

    // Creates a Typed `ScanExpression` for the specified table
    ScanExpression<T> FromScan<T>(Expression<Func<T, bool>> filterExpression = null);

    // Creates a Typed `ScanExpression` for the specified Global Index
    ScanExpression<T> FromScanIndex<T>(Expression<Func<T, bool>> filterExpression = null);

    // Executes the `ScanExpression` returning the specified maximum limit of results
    List<T> Scan<T>(ScanExpression<T> request, int limit);

    // Executes the `ScanExpression` returning lazy results transparently paged across multiple queries
    IEnumerable<T> Scan<T>(ScanExpression<T> request);

    // Executes AWSSDK `ScanRequest` returning the specified maximum limit of results
    List<T> Scan<T>(ScanRequest request, int limit);

    // Executes AWSSDK `ScanRequest` returning lazy results transparently paged across multiple queries
    IEnumerable<T> Scan<T>(ScanRequest request);

    // Executes AWSSDK `ScanRequest` with a custom conversion function to map ScanResponse to results
    IEnumerable<T> Scan<T>(ScanRequest request, Func<ScanResponse, IEnumerable<T>> converter);

    // Return Live ItemCount using Table ScanRequest
    long ScanItemCount<T>();

    // Return cached ItemCount in summary DescribeTable
    long DescribeItemCount<T>();

    // Creates a Typed `QueryExpression` for the specified table
    QueryExpression<T>
FromQuery<T>(Expression<Func<T, bool>> keyExpression = null);

    // Executes the `QueryExpression` returning lazy results transparently paged across multiple queries
    IEnumerable<T> Query<T>(QueryExpression<T> request);

    // Executes the `QueryExpression` returning the specified maximum limit of results
    List<T> Query<T>(QueryExpression<T> request, int limit);

    // Creates a Typed `QueryExpression` for the specified Local or Global Index
    QueryExpression<T> FromQueryIndex<T>(Expression<Func<T, bool>> keyExpression = null);

    // Executes AWSSDK `QueryRequest` returning the specified maximum limit of results
    List<T> Query<T>(QueryRequest request, int limit);

    // Executes AWSSDK `QueryRequest` returning lazy results transparently paged across multiple queries
    IEnumerable<T> Query<T>(QueryRequest request);

    // Executes AWSSDK `QueryRequest` with a custom conversion function to map QueryResponse to results
    IEnumerable<T> Query<T>(QueryRequest request, Func<QueryResponse, IEnumerable<T>> converter);

    // Create a clone of the PocoDynamo client with different default settings
    IPocoDynamo ClientWith(
        bool? consistentRead = null,
        long? readCapacityUnits = null,
        long? writeCapacityUnits = null,
        TimeSpan? pollTableStatus = null,
        TimeSpan? maxRetryOnExceptionTimeout = null,
        int? limit = null,
        bool?
scanIndexForward = null);

    // Disposes the underlying IAmazonDynamoDB client
    void Close();
}

// Available API's with Async equivalents
public interface IPocoDynamoAsync
{
    Task CreateMissingTablesAsync(IEnumerable<DynamoMetadataType> tables,
        CancellationToken token = default(CancellationToken));

    Task WaitForTablesToBeReadyAsync(IEnumerable<string> tableNames,
        CancellationToken token = default(CancellationToken));

    Task InitSchemaAsync();
}
```

### PocoDynamo Extension helpers

To maintain a minimal surface area for PocoDynamo, many additional API's that provide more DRY, typed API's were moved into [PocoDynamoExtensions](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack.Aws/src/ServiceStack.Aws/DynamoDb/PocoDynamoExtensions.cs):

```csharp
class PocoDynamoExtensions
{
    //Register Table
    DynamoMetadataType RegisterTable<T>();
    DynamoMetadataType RegisterTable(Type tableType);
    void RegisterTables(IEnumerable<Type> tableTypes);
    void AddValueConverter(Type type, IAttributeValueConverter valueConverter);

    //Get Table Metadata
    Table GetTableSchema<T>();
    DynamoMetadataType GetTableMetadata<T>();

    //Create Table
    bool CreateTableIfMissing<T>();
    bool CreateTableIfMissing(DynamoMetadataType table);
    bool CreateTable<T>(TimeSpan? timeout = null);
    bool DeleteTable<T>(TimeSpan?
timeout = null);

    //Increment/Decrement API's
    long DecrementById<T>(object id, string fieldName, long amount = 1);
    long IncrementById<T>(object id, Expression<Func<T, object>> fieldExpr, long amount = 1);
    long DecrementById<T>(object id, Expression<Func<T, object>> fieldExpr, long amount = 1);

    List<T> GetAll<T>();
    T GetItem<T>(DynamoId id);

    //Typed API overloads for popular hash object ids
    List<T> GetItems<T>(IEnumerable<int> ids);
    List<T> GetItems<T>(IEnumerable<long> ids);
    List<T> GetItems<T>(IEnumerable<string> ids);

    void DeleteItems<T>(IEnumerable<int> ids);
    void DeleteItems<T>(IEnumerable<long> ids);
    void DeleteItems<T>(IEnumerable<string> ids);

    //Scan Helpers
    IEnumerable<T> ScanInto<T>(ScanExpression request);
    List<T> ScanInto<T>(ScanExpression request, int limit);

    //Query Helpers
    IEnumerable<T> QueryInto<T>(QueryExpression request);
    List<T> QueryInto<T>(QueryExpression request, int limit);
}
```

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/ormlite/aws-rds-databases.md

---
title: OrmLite support for AWS & RDS Managed Databases
---

## AWS RDS Support

OrmLite has great support for AWS's managed RDS Databases. Follow these getting started guides to help you get up and running quickly:

- [PostgreSQL](https://github.com/ServiceStackApps/AwsGettingStarted#getting-started-with-aws-rds-postgresql-and-ormlite)
- [Aurora](https://github.com/ServiceStackApps/AwsGettingStarted#getting-started-with-aws-rds-aurora-and-ormlite)
- [MySQL](https://github.com/ServiceStackApps/AwsGettingStarted#getting-started-with-aws-rds-mysql-and-ormlite)
- [MariaDB](https://github.com/ServiceStackApps/AwsGettingStarted#getting-started-with-aws-rds-mariadb-and-ormlite)
- [SQL Server](https://github.com/ServiceStackApps/AwsGettingStarted#getting-started-with-aws-rds-sql-server-and-ormlite)

Source code for these examples can be found in the [AwsGettingStarted](https://github.com/ServiceStackApps/AwsGettingStarted) repository.
---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/aws.md

---
title: ServiceStack.Aws
---

![](/img/pages/aws/servicestack-aws-banner-420.png)

## .NET before Cloud Services

One thing that has held back .NET is its predisposition towards Windows-only technologies, missing out on all the industrial strength server solutions being primarily developed for hosting on Linux. This puts .NET at a disadvantage to other platforms which have first-class support for using the best technologies at their discretion, which outside of .NET are primarily running on Linux servers.

## AWS's servicified platform and polyglot ecosystem

By building their managed platform behind platform-agnostic web services, Amazon have largely eroded this barrier. We can finally tap into the same ecosystem [innovative Startups are using](https://techstacks.io/tech/amazon-ec2) with nothing more than the complexity cost of a service call - the required effort even further reduced with native clients.

Designing its services behind message-based APIs made it much easier for Amazon to enable a new polyglot world with [native clients for most popular platforms](https://aws.amazon.com/dynamodb/developer-resources/#SDK), putting .NET on a level playing field with other platforms thanks to the [AWS SDK for .NET's](http://aws.amazon.com/sdk-for-net/) well-maintained typed native clients.

By providing its functionality behind well-defined services, for the first time in a long time .NET developers are able to benefit from this new polyglot world where solutions and app logic written in other languages can be easily translated into .NET languages - a trait which has been invaluable whilst developing ServiceStack's integration support for AWS. This also means that features and improvements to reliability, performance and scalability added to its back-end servers benefit every language and ecosystem using them.
.NET developers are no longer at a disadvantage and can now leverage the same platform that Hacker Communities and the next wave of technology-leading Startups are built on, benefiting from the Tech Startup culture of sharing knowledge and experiences and pushing the limits of what's possible today.

AWS offers unprecedented productivity for back-end developers: its servicified hardware and infrastructure encapsulates the complexity of managing servers in a high-level programmatic abstraction that's effortless to consume and automate. These productivity gains are why we've been running our public servers on AWS for more than 2 years. The vast array of services on offer means we have everything our solutions need within the AWS Console. Our RDS managed PostgreSQL databases take care of automated backups and software updates, whilst the ease of snapshots means we can encapsulate and backup the configuration of our servers and easily spawn new instances.

AWS has made software developers more capable than ever, and with its first-class native client support leveling the playing field for .NET, there's no reason why [the next Instagram](http://highscalability.com/blog/2012/4/9/the-instagram-architecture-facebook-bought-for-a-cool-billio.html) couldn't be built by a small team of talented .NET developers.

## ServiceStack + Amazon Web Services

We're excited to participate in AWS's vibrant ecosystem and provide first-class support and deep integration with AWS, where ServiceStack's decoupled, substitutable functionality now seamlessly integrates with popular AWS back-end technologies. It's now more productive than ever to develop and host ServiceStack solutions entirely on the managed AWS platform!
## ServiceStack.Aws

All of ServiceStack's support for AWS is encapsulated within the single **ServiceStack.Aws** NuGet package, which references the latest modular AWSSDK **v3.1x** dependencies. **.NET 4.5+** projects can install it from NuGet with:

:::copy
`<PackageReference Include="ServiceStack.Aws" Version="10.*" />`
:::

This **ServiceStack.Aws** NuGet package includes implementations for the following ServiceStack providers:

- **[PocoDynamo](/aws-pocodynamo)** - Declarative, code-first POCO client for DynamoDB with LINQ support
- **[SqsMqServer](/aws#sqsmqserver)** - [MQ Server](/messaging) for invoking ServiceStack Services via Amazon SQS MQ Service
- **[S3VirtualFiles](/aws#s3virtualfiles)** - A read/write [Virtual FileSystem](/virtual-file-system) around Amazon's S3 Simple Storage Service
- **[DynamoDbAuthRepository](/aws#dynamodbauthrepository)** - A [UserAuth repository](/auth/authentication-and-authorization) storing UserAuth info in DynamoDB
- **[DynamoDbAppSettings](/aws#dynamodbappsettings)** - An [AppSettings provider](/appsettings) storing App configuration in DynamoDB
- **[DynamoDbCacheClient](/aws#managed-dynamodb-client)** - A [Caching Provider](/caching) for DynamoDB

## [AWS Live Examples](https://github.com/ServiceStackApps/AwsApps)

To demonstrate the ease with which you can build AWS-powered solutions with ServiceStack, we've rewritten 6 of our existing [Live Demos](https://github.com/ServiceStackApps/LiveDemos) to use a pure AWS managed backend using:

- [Amazon DynamoDB](https://aws.amazon.com/dynamodb/) for data persistence
- [Amazon S3](https://aws.amazon.com/s3/) for file storage
- [Amazon SQS](https://aws.amazon.com/sqs/) for background processing of MQ requests
- [Amazon SES](https://aws.amazon.com/ses/) for sending emails

[![](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/aws/apps/screenshots/awsapps.png)](https://github.com/ServiceStackApps/AwsApps)

### Simple AppHost Configuration

A good indication showing how simple it is
to build ServiceStack + AWS solutions is the size of the [AppHost](https://github.com/ServiceStackApps/AwsApps/blob/master/src/AwsApps/AppHost.cs), which contains all the configuration for the **5 different Apps** below utilizing all the AWS technologies listed above, contained within a **single** ASP.NET Web Application where each application's UI and back-end Service implementation are encapsulated under their respective sub directories:

- [/awsauth](https://github.com/ServiceStackApps/AwsApps/tree/master/src/AwsApps/awsauth)
- [/emailcontacts](https://github.com/ServiceStackApps/AwsApps/tree/master/src/AwsApps/emailcontacts)
- [/imgur](https://github.com/ServiceStackApps/AwsApps/tree/master/src/AwsApps/imgur)
- [/restfiles](https://github.com/ServiceStackApps/AwsApps/tree/master/src/AwsApps/restfiles)
- [/todo](https://github.com/ServiceStackApps/AwsApps/tree/master/src/AwsApps/todo)

## [AWS Razor Rockstars](http://awsrazor.netcore.io/)

[![](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/aws/apps/screenshots/awsrazor.png)](http://awsrazor.netcore.io/)

### Maintain Website Content in S3

The [implementation for AWS Razor Rockstars](https://github.com/ServiceStackApps/RazorRockstars/tree/master/src/RazorRockstars.S3) is kept with all the other ports of Razor Rockstars in the [RazorRockstars repository](https://github.com/ServiceStackApps/RazorRockstars). The main difference that stands out with [RazorRockstars.S3](https://github.com/ServiceStackApps/RazorRockstars/tree/master/src/RazorRockstars.S3) is that all the content for the App is **not** contained within the project, as all its Razor Views, Markdown Content, imgs, js, css, etc.
are instead being served **directly from an S3 Bucket** :) This is simply enabled by overriding `GetVirtualFileSources()` and adding the new `S3VirtualFiles` to the list of file sources: ```csharp public class AppHost : AppHostBase { public override void Configure(Container container) { //All Razor Views, Markdown Content, imgs, js, css, etc are served from an S3 Bucket var s3 = new AmazonS3Client(AwsConfig.AwsAccessKey, AwsConfig.AwsSecretKey, RegionEndpoint.USEast1); VirtualFiles = new S3VirtualFiles(s3, AwsConfig.S3BucketName); } public override List<IVirtualPathProvider> GetVirtualFileSources() { //Add S3 Bucket as lowest priority Virtual Path Provider var pathProviders = base.GetVirtualFileSources(); pathProviders.Add(VirtualFiles); return pathProviders; } } ``` The code to import RazorRockstars content into an S3 bucket is trivial: we just use a local FileSystem provider to get all the files we're interested in from the main ASP.NET RazorRockstars projects folder, then write them to the configured S3 VirtualFiles Provider: ```csharp var s3Client = new AmazonS3Client(AwsConfig.AwsAccessKey, AwsConfig.AwsSecretKey, RegionEndpoint.USEast1); var s3 = new S3VirtualFiles(s3Client, AwsConfig.S3BucketName, appHost); var fs = new FileSystemVirtualPathProvider(appHost, "~/../RazorRockstars.WebHost".MapHostAbsolutePath()); var skipDirs = new[] { "bin", "obj" }; var matchingFileTypes = new[] { "cshtml", "md", "css", "js", "png", "jpg" }; //Update links to reference the new S3 AppHost.cs + RockstarsService.cs source code var replaceHtmlTokens = new Dictionary<string, string> { { "title-bg.png", "title-bg-aws.png" }, //S3 Title Background { "https://gist.github.com/3617557.js", "https://gist.github.com/mythz/396dbf54ce6079cc8b2d.js" }, { "https://gist.github.com/3616766.js", "https://gist.github.com/mythz/ca524426715191b8059d.js" }, { "RazorRockstars.WebHost/RockstarsService.cs", "RazorRockstars.S3/RockstarsService.cs" }, }; foreach (var file in fs.GetAllFiles()) { if 
(skipDirs.Any(x => file.VirtualPath.StartsWith(x))) continue; if (!matchingFileTypes.Contains(file.Extension)) continue; if (file.Extension == "cshtml") { var html = file.ReadAllText(); replaceHtmlTokens.Each(x => html = html.Replace(x.Key, x.Value)); s3.WriteFile(file.VirtualPath, html); } else { s3.WriteFile(file); } } ``` During the import we also update the links in the Razor `*.cshtml` pages to reference the new RazorRockstars.S3 content. ### Update S3 Bucket to enable LiveReload of Razor Views and Markdown Another nice feature of having all content maintained in an S3 Bucket is that you can just change files in the S3 Bucket directly and have all App Servers immediately reload the Razor Views, Markdown content and static resources without redeploying. #### CheckLastModifiedForChanges To enable this feature we just tell the Razor and Markdown plugins to check the source file for changes before displaying each page: ```csharp GetPlugin<MarkdownFormat>().CheckLastModifiedForChanges = true; Plugins.Add(new RazorFormat { CheckLastModifiedForChanges = true }); ``` When this is enabled the View Engine checks the ETag of the source file to find out if it has changed and, if so, rebuilds and replaces it with the new view before rendering it. Given [S3 supports object versioning](http://docs.aws.amazon.com/AmazonS3/latest/dev/Versioning.html) this feature should enable a new class of use-cases for developing content-heavy management sites with ServiceStack. #### Explicit RefreshPage One drawback of enabling `CheckLastModifiedForChanges` is that it forces a remote S3 call for each view before rendering it.
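Conceptually, `CheckLastModifiedForChanges` behaves like a conditional cache keyed on the source file's ETag. A minimal sketch of the idea (hypothetical names, not the actual view engine internals):

```csharp
using System;
using System.Collections.Generic;

// Hypothetical sketch of ETag-based view invalidation (not RazorFormat's
// actual implementation): a compiled view is reused until the source
// file's ETag changes, at which point it's recompiled.
class CompiledViewCache
{
    // viewPath -> (ETag the view was compiled from, compiled output)
    readonly Dictionary<string, (string Etag, string Output)> cache = new();

    public string Render(string viewPath, Func<string> getCurrentEtag, Func<string> compile)
    {
        var etag = getCurrentEtag(); // e.g. a remote metadata call to S3
        if (cache.TryGetValue(viewPath, out var entry) && entry.Etag == etag)
            return entry.Output;     // unchanged: reuse the compiled view

        var output = compile();      // changed (or first use): rebuild
        cache[viewPath] = (etag, output);
        return output;
    }
}
```

Note that every render still performs one remote metadata lookup, which is the drawback the explicit refresh approach below avoids.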
A more efficient approach is to instead notify the App Servers which files have changed so they can reload them once, alleviating the need for multiple ETag checks at runtime, which is the approach we've taken with the [UpdateS3 Service](https://github.com/ServiceStackApps/RazorRockstars/blob/e159bb9d2e27eba7fc1a9ce1822b479602de8e0f/src/RazorRockstars.S3/RockstarsService.cs#L139): ```csharp if (request.Razor) { var kurtRazor = VirtualFiles.GetFile("stars/dead/cobain/default.cshtml"); VirtualFiles.WriteFile(kurtRazor.VirtualPath, UpdateContent("UPDATED RAZOR", kurtRazor.ReadAllText(), request.Clear)); HostContext.GetPlugin<RazorFormat>().RefreshPage(kurtRazor.VirtualPath); //Force reload of Razor View } var kurtMarkdown = VirtualFiles.GetFile("stars/dead/cobain/Content.md"); VirtualFiles.WriteFile(kurtMarkdown.VirtualPath, UpdateContent("UPDATED MARKDOWN", kurtMarkdown.ReadAllText(), request.Clear)); HostContext.GetPlugin<MarkdownFormat>().RefreshPage(kurtMarkdown.VirtualPath); //Force reload of Markdown ``` #### Live Reload Demo You can test live reloading of the above Service with the routes below which modify Markdown and Razor views with the current time: - [/updateS3](http://awsrazor.netcore.io/updateS3) - Update Markdown Content - [/updateS3?razor=true](http://awsrazor.netcore.io/updateS3?razor=true) - Update Razor View - [/updateS3?razor=true&clear=true](http://awsrazor.netcore.io/updateS3?razor=true&clear=true) - Revert changes ::: info This forces a recompile of the modified views which greatly benefits from a fast CPU and is a bit slow on our Live Demos server that's running on a **m1.small** instance shared with 25 other ASP.NET Web Applications ::: ## AWS Imgur ![](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/aws/apps/screenshots/imgur.png) ### S3VirtualFiles The backend [ImageService.cs](https://github.com/ServiceStackApps/AwsApps/blob/master/src/AwsApps/imgur/ImageService.cs) implementation for AWS Imgur has been rewritten to use 
the Virtual FileSystem instead of [accessing the FileSystem directly](https://github.com/ServiceStackApps/Imgur/blob/master/src/Imgur/Global.asax.cs). The benefit of this approach is that with [2 lines of configuration](https://github.com/ServiceStackApps/AwsApps/blob/4817f5c6ad69defd74d528403bfdb03e5958b0b3/src/AwsApps/AppHost.cs#L44-L45) we can have files written to an S3 Bucket instead: ```csharp var s3Client = new AmazonS3Client(AwsConfig.AwsAccessKey, AwsConfig.AwsSecretKey, RegionEndpoint.USEast1); VirtualFiles = new S3VirtualFiles(s3Client, AwsConfig.S3BucketName); ``` If we comment out the above configuration any saved files are instead written to the local FileSystem (the default). The benefits of using managed S3 File Storage are better scalability as your App Servers can remain stateless, improved performance as the overhead of serving static assets can be offloaded by referencing the S3 Bucket directly, and for even better responsiveness you can connect the S3 Bucket to a CDN. ### Using S3VirtualFiles with S3 Compatible services To use `S3VirtualFiles` with another provider's S3-compatible service, the configuration of the `AmazonS3Client` will be different. For example, if we want to connect to [DigitalOcean Spaces](https://www.digitalocean.com/products/spaces) with `S3VirtualFiles` we would configure the `AmazonS3Client` with the following constructor: ```csharp var doSpacesClient = new AmazonS3Client("my-spaces-key", "my-spaces-secret", new AmazonS3Config { ServiceURL = "https://sfo3.digitaloceanspaces.com", }); var doSpacesVfs = new S3VirtualFiles(doSpacesClient, "my-spaces-name"); ``` The `ServiceURL` is a region-specific URL rather than the URL to your Space directly, and your Space name is provided as the bucket name in `S3VirtualFiles`. Other services might use the `AmazonS3Config` differently, but as long as the `AmazonS3Client` is configured correctly for the service you are using, `S3VirtualFiles` can be used the same way.
## REST Files ![](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/aws/apps/screenshots/restfiles.png) The REST Files GitHub-like explorer is another example that was [rewritten to use ServiceStack's Virtual File System](https://github.com/ServiceStackApps/AwsApps/blob/master/src/AwsApps/restfiles/FilesService.cs) and now provides remote file management of an S3 Bucket behind a RESTful API. ## AWS Email Contacts ![](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/aws/apps/screenshots/emailcontacts.png) ### SqsMqServer The AWS Email Contacts example shows the same long-running [EmailContact Service](https://github.com/ServiceStackApps/AwsApps/blob/4817f5c6ad69defd74d528403bfdb03e5958b0b3/src/AwsApps/emailcontacts/EmailContactServices.cs#L81) being executed from both HTTP and the MQ Server by just [changing which URL the HTML Form is posted to](https://github.com/ServiceStackApps/AwsApps/blob/4817f5c6ad69defd74d528403bfdb03e5958b0b3/src/AwsApps/emailcontacts/default.cshtml#L203): ```html //html <form id="form-emailcontact" method="POST" action="@(new EmailContact().ToPostUrl())" data-action-alt="@(new EmailContact().ToOneWayUrl())"> ... <div> <input type="checkbox" id="chkAction" data-click="toggleAction" /> <label for="chkAction">Email via MQ</label> </div> ...
</form> ``` > The URLs are populated from a typed Request DTO using the [Reverse Routing Extension methods](/routing#reverse-routing) Checking the **Email via MQ** checkbox fires the JavaScript handler below that's registered as a [declarative event in ss-utils.js](/ss-utils-js#declarative-events): ```js $(document).bindHandlers({ toggleAction: function() { var $form = $(this).closest("form"), action = $form.attr("action"); $form.attr("action", $form.data("action-alt")) .data("action-alt", action); } }); ``` The code to configure and start an SQS MQ Server is similar to [other MQ Servers](/messaging): ```csharp container.Register<IMessageService>(c => new SqsMqServer( AwsConfig.AwsAccessKey, AwsConfig.AwsSecretKey, RegionEndpoint.USEast1) { DisableBuffering = true, // Trade-off latency vs efficiency }); var mqServer = container.Resolve<IMessageService>(); mqServer.RegisterHandler<EmailContacts.EmailContact>(ExecuteMessage); mqServer.Start(); ``` When an MQ Server is registered, ServiceStack automatically publishes Requests accepted on the "One Way" [pre-defined route](/routing#pre-defined-routes) to the registered MQ broker. The message is later picked up and executed by a Message Handler on a background Thread.
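The flow can be pictured with a toy in-memory queue (a conceptual sketch only; ServiceStack's actual `SqsMqServer` uses SQS queues and its own worker threads): messages published to the broker are later picked up and executed by the registered handler on a background thread.

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

// Toy MQ server sketch: one queue, handlers keyed by message type,
// a single background worker that executes each message's handler.
class TinyMqServer
{
    readonly BlockingCollection<object> queue = new();
    readonly ConcurrentDictionary<Type, Action<object>> handlers = new();

    public void RegisterHandler<T>(Action<T> handler) =>
        handlers[typeof(T)] = msg => handler((T)msg);

    // What the "One Way" endpoint effectively does: enqueue and return
    public void Publish(object message) => queue.Add(message);

    public Task Start() => Task.Run(() =>
    {
        foreach (var msg in queue.GetConsumingEnumerable()) // background thread
            if (handlers.TryGetValue(msg.GetType(), out var handle))
                handle(msg);
    });

    public void Stop() => queue.CompleteAdding();
}
```

The HTTP caller gets an immediate response while the handler runs the long-running work later, which is the behaviour the checkbox above toggles.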
## AWS Auth ![](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/aws/apps/screenshots/awsauth.png) ### DynamoDbAuthRepository AWS Auth is an example showing how easy it is to enable multiple Auth Providers within the same App which allows Sign-Ins from Twitter, Facebook, GitHub, Google, Yahoo and LinkedIn OAuth providers, as well as HTTP Basic and Digest Auth and normal Registered User logins and Custom User Roles validation, all managed in DynamoDB Tables using the registered `DynamoDbAuthRepository` below: ```csharp container.Register<IAuthRepository>(new DynamoDbAuthRepository(db, initSchema:true)); ``` Standard registration code is used to configure the `AuthFeature` with all the different Auth Providers AWS Auth wants to support: ```csharp return new AuthFeature(() => new AuthUserSession(), new IAuthProvider[] { new CredentialsAuthProvider(), //HTML Form post of UserName/Password credentials new BasicAuthProvider(), //Sign-in with HTTP Basic Auth new DigestAuthProvider(AppSettings), //Sign-in with HTTP Digest Auth new TwitterAuthProvider(AppSettings), //Sign-in with Twitter new FacebookAuthProvider(AppSettings), //Sign-in with Facebook new YahooOpenIdOAuthProvider(AppSettings), //Sign-in with Yahoo OpenId new OpenIdOAuthProvider(AppSettings), //Sign-in with Custom OpenId new GoogleOAuth2Provider(AppSettings), //Sign-in with Google OAuth2 Provider new LinkedInOAuth2Provider(AppSettings), //Sign-in with LinkedIn OAuth2 Provider new GithubAuthProvider(AppSettings), //Sign-in with GitHub OAuth Provider }) { HtmlRedirect = "/awsauth/", //Redirect back to AWS Auth app after OAuth sign in IncludeRegistrationService = true, //Include ServiceStack's built-in RegisterService }; ``` ### DynamoDbAppSettings The AuthFeature looks for the OAuth settings for each AuthProvider in the registered [AppSettings](/appsettings), which for deployed **Release** builds gets them from multiple sources. 
Since `DynamoDbAppSettings` is registered first in a `MultiAppSettings` collection it checks entries in the DynamoDB `ConfigSetting` Table first before falling back to local [Web.config appSettings](https://github.com/ServiceStackApps/AwsApps/blob/4817f5c6ad69defd74d528403bfdb03e5958b0b3/src/AwsApps/Web.config#L15): ```csharp #if !DEBUG AppSettings = new MultiAppSettings( new DynamoDbAppSettings(new PocoDynamo(AwsConfig.CreateAmazonDynamoDb()), initSchema:true), new AppSettings()); // fallback to Web.config #endif ``` Storing production config in DynamoDB reduces the effort of maintaining production settings and keeps them decoupled from source code. The App Settings were populated in DynamoDB using [this simple script](https://github.com/ServiceStackApps/AwsApps/blob/9d4d3c3dfbf127ce0890d0984c264e8b440abd3f/src/AwsApps/AdminTasks.cs#L58) which imports its settings from a local [appsettings.txt file](/appsettings#textfilesettings): ```csharp var fileSettings = new TextFileSettings("~/../../deploy/appsettings.txt".MapHostAbsolutePath()); var dynamoSettings = new DynamoDbAppSettings(AwsConfig.CreatePocoDynamo()); dynamoSettings.InitSchema(); //dynamoSettings.Set("SmtpConfig", "{Username:REPLACE_USER,Password:REPLACE_PASS,Host:AWS_HOST,Port:587}"); foreach (var config in fileSettings.GetAll()) { dynamoSettings.Set(config.Key, config.Value); } ``` #### ConfigSettings Table in DynamoDB ![](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/release-notes/aws-configsettings.png) ## AWS Todos ![](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/aws/apps/screenshots/todos.png) The [Backbone TODO App](http://todomvc.com/examples/backbone/) is a famous minimal app used as a "Hello, World" example to showcase and compare JavaScript client frameworks.
The example also serves as a good illustration of the clean and minimal code it takes to build a simple CRUD Service utilizing a DynamoDB back-end with the new PocoDynamo client: ```csharp public class TodoService : Service { public IPocoDynamo Dynamo { get; set; } public object Get(Todo todo) { if (todo.Id != default(long)) return Dynamo.GetItem<Todo>(todo.Id); return Dynamo.GetAll<Todo>(); } public Todo Post(Todo todo) { Dynamo.PutItem(todo); return todo; } public Todo Put(Todo todo) { return Post(todo); } public void Delete(Todo todo) { Dynamo.DeleteItem<Todo>(todo.Id); } } ``` As it's a clean POCO, the `Todo` model can also be reused as-is throughout ServiceStack in Redis, OrmLite, Caching, Config, DTO's, etc: ```csharp public class Todo { [AutoIncrement] public long Id { get; set; } public string Content { get; set; } public int Order { get; set; } public bool Done { get; set; } } ``` ## [PocoDynamo](/aws-pocodynamo) PocoDynamo is a highly productive, feature-rich, typed .NET client which extends [ServiceStack's Simple POCO life](http://stackoverflow.com/a/32940275/85785) by enabling re-use of your code-first data models with Amazon's industrial strength and highly-scalable NoSQL [DynamoDB](https://aws.amazon.com/dynamodb/). ![](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/aws/pocodynamo/related-customer.png) #### First class support for reusable, code-first POCOs It works conceptually like ServiceStack's other code-first [OrmLite](/ormlite/) and [Redis](https://github.com/ServiceStack/ServiceStack.Redis) clients by providing a high-fidelity, managed client that enhances AWSSDK's low-level [IAmazonDynamoDB client](http://docs.aws.amazon.com/amazondynamodb/latest/developerguide/UsingAWSsdkForDotNet.html), with rich, native support for intuitively mapping your re-usable code-first POCO Data models into [DynamoDB Data Types](http://docs.aws.amazon.com/amazondynamodb/latest/APIReference/API_Types.html).
#### [AutoQuery DynamoDB](/aws-pocodynamo) Built on top of PocoDynamo, [AutoQuery Data's](/autoquery/data) `DynamoDbSource` provides the most productive development experience for effortlessly creating rich, queryable and optimized Services for DynamoDB data stores using only a typed Request DTO. ### PocoDynamo Features #### Advanced idiomatic .NET client PocoDynamo provides an idiomatic API that leverages .NET's advanced language features, with streaming APIs returning `IEnumerable<T>` lazily evaluated responses that transparently perform multi-paged requests behind-the-scenes as the result set is iterated. Its high-level APIs provide a clean, lightweight adapter to transparently map between .NET built-in data types and DynamoDB's low-level attribute values. Its efficient batched APIs take advantage of DynamoDB's `BatchWriteItem` and `BatchGetItem` batch operations to perform the minimum number of requests required to implement each API. #### Typed, LINQ provider for Query and Scan Operations PocoDynamo also provides rich, typed LINQ-like querying support for constructing DynamoDB Query and Scan operations, dramatically reducing the effort to query DynamoDB, enhancing readability whilst benefiting from Type safety in .NET. #### Declarative Tables and Indexes Behind the scenes DynamoDB is built on a dynamic schema which, whilst open and flexible, can be cumbersome to work with directly in typed languages like C#. PocoDynamo bridges the gap and lets your app bind to impl-free and declarative POCO data models that provide an ideal high-level abstraction for your business logic, hiding a lot of the complexity of working with DynamoDB - dramatically reducing the code and effort required whilst increasing the readability and maintainability of your App's business logic. It includes optimal support for defining simple local indexes which only require declaratively annotating properties to index with an `[Index]` attribute.
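As a sketch of what this declarative approach looks like (the `Song` model and its properties are hypothetical; the example assumes the `[Index]` and `[AutoIncrement]` data annotation attributes mentioned in these docs):

```csharp
using ServiceStack.DataAnnotations;

public class Song
{
    [AutoIncrement]
    public long Id { get; set; }

    public string Title { get; set; }

    [Index] // declaratively marks this property to be indexed for the table
    public long BandId { get; set; }
}
```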
Typed POCO Data Models can be used to define more complex Local and Global DynamoDB Indexes by implementing the `IGlobalIndex<Poco>` or `ILocalIndex<Poco>` interfaces which PocoDynamo uses along with the POCO's class structure to construct Table indexes at the same time it creates the tables. In this way the Type is used as a DSL to define DynamoDB indexes where the definition of the index is decoupled from the imperative code required to create and query it, reducing the effort to create them whilst improving the visualization and understanding of your DynamoDB architecture which can be inferred at a glance from the POCO's Type definition. PocoDynamo also includes first-class support for constructing and querying Global and Local Indexes using a familiar, typed LINQ provider. #### Resilient Each operation is called within a managed execution which transparently absorbs the variance in cloud service reliability with automatic retries of temporary errors, using an exponential backoff as recommended by Amazon. #### Enhances existing APIs PocoDynamo APIs are a lightweight layer modeled after DynamoDB's APIs, making it easy to predict which DynamoDB operations each API calls under the hood, retaining your existing knowledge investment in DynamoDB. When more flexibility is needed you can access the low-level `AmazonDynamoDBClient` from the `IPocoDynamo.DynamoDb` property and talk with it directly. Whilst PocoDynamo doesn't save you from needing to learn DynamoDB, its deep integration with .NET and rich support for POCOs smooths out the impedance mismatch to enable a type-safe, idiomatic, productive development experience.
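The retry behaviour described under *Resilient* above can be sketched as follows (a hypothetical helper, not PocoDynamo's internal implementation): each failed attempt sleeps twice as long as the last before retrying, up to a retry limit, after which the exception propagates.

```csharp
using System;
using System.Threading;

// Hedged sketch of retry-with-exponential-backoff: delays double on each
// failed attempt (100ms, 200ms, 400ms, ... for a 100ms base) until the
// operation succeeds or maxRetries is exhausted.
static class Retry
{
    public static T Execute<T>(Func<T> operation, int maxRetries = 3, int baseDelayMs = 100)
    {
        for (var attempt = 0; ; attempt++)
        {
            try
            {
                return operation();
            }
            catch (Exception) when (attempt < maxRetries)
            {
                Thread.Sleep(baseDelayMs * (1 << attempt)); // exponential backoff
            }
        }
    }
}
```

Backoff like this spreads retries out over time so a temporarily throttled or unavailable service isn't hammered with immediate re-requests.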
#### High-level features PocoDynamo includes its own high-level features to improve the re-usability of your POCO models and the development experience of working with DynamoDB with support for Auto Incrementing sequences, Query expression builders, auto escaping and converting of Reserved Words to placeholder values, configurable converters, scoped client configurations, related items, conventions, aliases, dep-free data annotation attributes and more. ### Download PocoDynamo is contained in ServiceStack's AWS NuGet package: :::copy `<PackageReference Include="ServiceStack.Aws" Version="10.*" />` ::: > PocoDynamo has a 10 Tables [free-quota usage](https://servicestack.net/download#free-quotas) limit which is unlocked with a [license key](https://servicestack.net/pricing). To get started we'll need to create an instance of `AmazonDynamoDBClient` with your AWS credentials and Region info: ```csharp var awsDb = new AmazonDynamoDBClient(AWS_ACCESS_KEY, AWS_SECRET_KEY, RegionEndpoint.USEast1); ``` Then to create a PocoDynamo client pass the configured AmazonDynamoDBClient instance above: ```csharp var db = new PocoDynamo(awsDb); ``` ::: info Clients are Thread-Safe so you can register them as a singleton and share the same instance throughout your App ::: ### Creating a Table with PocoDynamo PocoDynamo enables a declarative code-first approach where it's able to create DynamoDB Table schemas from just your POCO class definition. 
Whilst you could call the `db.CreateTable<Todo>()` API and create the Table directly, the recommended approach is instead to register all the tables your App uses with PocoDynamo on Startup, then just call `InitSchema()` which will go through and create all missing tables: ```csharp //PocoDynamo var db = new PocoDynamo(awsDb) .RegisterTable<Todo>(); db.InitSchema(); db.GetTableNames().PrintDump(); ``` In this way your App ends up in the same state with all tables created whether it was started with **no tables**, **all tables** or only a **partial list** of tables. After the tables are created we query DynamoDB to dump its entire list of Tables, which if you started with an empty DynamoDB instance would print the single **Todo** table name to the Console: ```js [ Todo ] ``` ### Managed DynamoDB Client Every request in PocoDynamo is invoked inside a managed execution where any temporary errors are retried using the [AWS recommended retries exponential backoff](http://docs.aws.amazon.com/amazondynamodb/latest/developerguide/ErrorHandling.html#APIRetries). All PocoDynamo APIs returning `IEnumerable<T>` return a lazily evaluated stream which behind-the-scenes sends multiple paged requests as needed whilst the sequence is being iterated. As LINQ APIs are also lazily evaluated you could use `Take()` to only download the exact number of results you need. So you can query the first 100 table names with: ```csharp //PocoDynamo var first100TableNames = db.GetTableNames().Take(100).ToList(); ``` and PocoDynamo will only make the minimum number of requests required to fetch the first 100 results.
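The lazy paging behaviour can be illustrated with a self-contained sketch (hypothetical code, not PocoDynamo's internals): the generator only issues the next page request when iteration actually reaches it, so `Take(100)` over 25-item pages results in exactly 4 page requests, however many results the source could return.

```csharp
using System;
using System.Collections.Generic;

static class PagedSource
{
    // Lazily streams items, fetching one "remote" page at a time;
    // an empty page signals the end of the result set.
    public static IEnumerable<int> GetAllItems(Func<int, int[]> fetchPage)
    {
        for (var page = 0; ; page++)
        {
            var items = fetchPage(page);
            if (items.Length == 0) yield break;
            foreach (var item in items)
                yield return item;
        }
    }
}
```

Because `yield return` suspends the generator between items, a consumer that stops early (via `Take`, `First`, breaking out of `foreach`, etc.) never triggers the fetches for the pages it didn't reach.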
## PocoDynamo Examples ### [DynamoDbCacheClient](/aws#managed-dynamodb-client) We've been quick to benefit from the productivity advantages of PocoDynamo ourselves, using it to rewrite [DynamoDbCacheClient](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack.Aws/src/ServiceStack.Aws/DynamoDb/DynamoDbCacheClient.cs) which is now just 2/3 the size of and much easier to maintain than the existing [Community-contributed version](https://github.com/ServiceStack/ServiceStack/blob/22aca105d39997a8ea4c9dc20b242f78e07f36e0/src/ServiceStack.Caching.AwsDynamoDb/DynamoDbCacheClient.cs), whilst at the same time extending it with even more functionality: it now implements the `ICacheClientExtended` API. ### [DynamoDbAuthRepository](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack.Aws/src/ServiceStack.Aws/DynamoDb/DynamoDbAuthRepository.cs) PocoDynamo's code-first Typed API made it much easier to implement value-added DynamoDB functionality like the [DynamoDbAuthRepository](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack.Aws/src/ServiceStack.Aws/DynamoDb/DynamoDbAuthRepository.cs) which, sharing a similar code-first POCO approach with OrmLite, ended up being a straightforward port of the existing [OrmLiteAuthRepository](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Server/Auth/OrmLiteAuthRepository.cs) that was able to reuse the existing `UserAuth` and `UserAuthDetails` POCO data models. ### [DynamoDbTests](https://github.com/ServiceStack/ServiceStack/tree/main/ServiceStack.Aws/tests/ServiceStack.Aws.DynamoDbTests) Despite its young age, PocoDynamo has a comprehensive test suite behind it and has become our exclusive client for developing DynamoDB-powered Apps.
### [PocoDynamo Docs](/aws-pocodynamo) This only scratches the surface of what PocoDynamo can do; comprehensive documentation is available in the [PocoDynamo project](/aws-pocodynamo) explaining how it compares to DynamoDB's AWSSDK client, how to use it to store related data, how to query indexes and how to use its rich LINQ querying functionality to query DynamoDB. ## [Getting started with AWS + ServiceStack Guides](https://github.com/ServiceStackApps/AwsGettingStarted) Amazon offers managed hosting for a number of RDBMS and Caching servers which ServiceStack provides first-class clients for. We've provided a number of guides to walk through setting up these services from your AWS account and connecting to them with ServiceStack's typed .NET clients. ### [AWS RDS PostgreSQL and OrmLite](https://github.com/ServiceStackApps/AwsGettingStarted/blob/master/docs/postgres-guide.md) [![](https://github.com/ServiceStack/Assets/raw/master/img/aws/rds-postgres-powered-by-aws.png)](https://github.com/ServiceStackApps/AwsGettingStarted/blob/master/docs/postgres-guide.md) ### [AWS RDS Aurora and OrmLite](https://github.com/ServiceStackApps/AwsGettingStarted/blob/master/docs/aurora-guide.md) [![](https://github.com/ServiceStack/Assets/raw/master/img/aws/rds-aurora-powered-by-aws.png)](https://github.com/ServiceStackApps/AwsGettingStarted/blob/master/docs/aurora-guide.md) ### [AWS RDS MySQL and OrmLite](https://github.com/ServiceStackApps/AwsGettingStarted/blob/master/docs/mysql-guide.md) [![](https://github.com/ServiceStack/Assets/raw/master/img/aws/rds-mysql-powered-by-aws.png)](https://github.com/ServiceStackApps/AwsGettingStarted/blob/master/docs/mysql-guide.md) ### [AWS RDS MariaDB and OrmLite](https://github.com/ServiceStackApps/AwsGettingStarted/blob/master/docs/mariadb-guide.md) [![](https://github.com/ServiceStack/Assets/raw/master/img/aws/rds-mariadb-powered-by-aws.png)](https://github.com/ServiceStackApps/AwsGettingStarted/blob/master/docs/mariadb-guide.md) ### [AWS RDS
SQL Server and OrmLite](https://github.com/ServiceStackApps/AwsGettingStarted/blob/master/docs/mssql-guide.md) [![](https://github.com/ServiceStack/Assets/raw/master/img/aws/rds-sqlserver-powered-by-aws.png)](https://github.com/ServiceStackApps/AwsGettingStarted/blob/master/docs/mssql-guide.md) ### [AWS ElastiCache Redis and ServiceStack](https://github.com/ServiceStackApps/AwsGettingStarted/blob/master/docs/redis-guide.md) [![](https://github.com/ServiceStack/Assets/raw/master/img/aws/elasticache-redis-powered-by-aws.png)](https://github.com/ServiceStackApps/AwsGettingStarted/blob/master/docs/redis-guide.md) ### [AWS ElastiCache Memcached and ServiceStack](https://github.com/ServiceStackApps/AwsGettingStarted/blob/master/docs/memcached-guide.md) [![](https://github.com/ServiceStack/Assets/raw/master/img/aws/elasticache-memcached-powered-by-aws.png)](https://github.com/ServiceStackApps/AwsGettingStarted/blob/master/docs/memcached-guide.md) The source code used in each guide is also available in the [AwsGettingStarted](https://github.com/ServiceStackApps/AwsGettingStarted) repo. --- # Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/azure-service-bus-mq.md --- title: Azure Service Bus MQ --- ## Enable in an existing Web App Use the `servicebus` mixin to register an [MQ Server](/messaging) for Azure Service Bus with an existing .NET App: :::sh npx add-in servicebus ::: ## Worker Service Template The recommended way to use Azure Service Bus in stand-alone MQ Servers (i.e.
without HTTP access) is to run the MQ Server in an ASP.NET Core Worker Service by starting from a pre-configured project template: <worker-templates template="worker-servicebus"></worker-templates> ## Manual Configuration Support for registering Azure Service Bus as an [MQ Server](/messaging) in ServiceStack is available in the [ServiceStack.Azure](https://www.nuget.org/packages/ServiceStack.Azure) NuGet package: :::copy `<PackageReference Include="ServiceStack.Azure" Version="10.*" />` ::: Once installed, ServiceBus can be configured the same way as any other [MQ Server](/messaging): first register the ServiceBus `IMessageService` provider, then register all the ServiceStack Services you want to be able to invoke via MQ: ```csharp [assembly: HostingStartup(typeof(MyApp.ConfigureMq))] namespace MyApp; public class ConfigureMq : IHostingStartup { public void Configure(IWebHostBuilder builder) => builder .ConfigureServices((context, services) => { services.AddSingleton<IMessageService>(c => new ServiceBusMqServer(context.Configuration.GetConnectionString("ServiceBus"))); }) .ConfigureAppHost(afterAppHostInit: appHost => { var mqServer = appHost.Resolve<IMessageService>(); // Register MQ endpoints for APIs mqServer.RegisterHandler<MyRequest>(ExecuteMessage); mqServer.Start(); }); } ``` --- # Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/azure.md --- slug: azure title: Azure Resources --- ## ServiceStack.Azure The ServiceStack.Azure package provides support for Azure Service Bus and Azure Blob Storage. All features are encapsulated in the single ServiceStack.Azure package.
To install the package, run from NuGet: :::copy `<PackageReference Include="ServiceStack.Azure" Version="10.*" />` ::: ServiceStack.Azure includes implementations of the following ServiceStack providers: - [ServiceBusMqServer](#ServiceBusMqServer) - [MQ Server](/messaging) for invoking ServiceStack Services via Azure ServiceBus - [AzureBlobVirtualFiles](#virtual-filesystem-backed-by-azure-blob-storage) - Virtual file system based on Azure Blob Storage - [AzureTableCacheClient](#caching-support-with-azure-table-storage) - Cache client over Azure Table Storage ### ServiceBusMqServer The code to configure and start a ServiceBus MQ Server is similar to other MQ Servers: ```csharp container.Register<IMessageService>(c => new ServiceBusMqServer(ConnectionString)); var mqServer = container.Resolve<IMessageService>(); mqServer.RegisterHandler<ServiceDto>(ExecuteMessage); AfterInitCallbacks.Add(appHost => mqServer.Start()); ``` Where `ConnectionString` is the connection string to your Service Bus; how to obtain it from the Azure Portal is covered in the [Get Started with Service Bus queues](https://docs.microsoft.com/en-us/azure/service-bus-messaging/service-bus-dotnet-get-started-with-queues) article. When an MQ Server is registered, ServiceStack automatically publishes Requests accepted on the "One Way" pre-defined route to the registered MQ broker. The message is later picked up and executed by a Message Handler on a background Thread. ## Virtual FileSystem backed by Azure Blob Storage You can use an Azure Blob Storage Container to serve website content with **AzureBlobVirtualFiles**. ```csharp public class AppHost : AppHostBase { public override void Configure(Container container) { //All Razor Views, Markdown Content, imgs, js, css, etc are served from an Azure Blob Storage container //Use connection string to Azure Storage Emulator.
For a real application you should use the connection string //to your Azure Storage account var azureBlobConnectionString = "UseDevelopmentStorage=true"; //Azure container which holds your files. If it does not exist it will be automatically created. var containerName = "myazurecontainer"; VirtualFiles = new AzureBlobVirtualFiles(azureBlobConnectionString, containerName); AddVirtualFileSources.Add(VirtualFiles); } } ``` ## Caching support with Azure Table Storage The AzureTableCacheClient implements [ICacheClientExtended](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Caching/ICacheClientExtended.cs) and [IRemoveByPattern](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Caching/IRemoveByPattern.cs) using Azure Table Storage. ```csharp public class AppHost : AppHostBase { public override void Configure(Container container) { string cacheConnStr = "UseDevelopmentStorage=true;"; container.Register<ICacheClient>(new AzureTableCacheClient(cacheConnStr)); } } ``` ### Deploying to Azure See [Rockwind.Azure](https://github.com/sharp-apps/rockwind-azure) for a working configuration and step-by-step guide to deploy .NET Core Web Apps to Azure using Docker.
# Community Resources

- [Using the Azure Cache With ServiceStack](http://blog.emmanuelnelson.com/post/33303196083/using-the-azure-cache-with-service-stack) by [@emmanuelnelson](http://emmanuelnelson.com/about-me)
- [Securing ServiceStack using Azure Authentication Library and WPF Client](http://dhickey-ie-archive.azurewebsites.net/post/2012/12/12/Securing-ServiceStack-using-Azure-Authentication-Library.aspx) by [@randompunter](http://twitter.com/randompunter)
- [ServiceStack.Azure](https://github.com/ServiceStack/ServiceStack.Azure), supporting VirtualPathProvider backed by Azure Blob Storage, and ICacheProvider backed by Azure Table Storage

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/background-jobs.md

---
title: Background Jobs
---

ServiceStack.Jobs is our solution for queueing and managing background jobs and scheduled tasks in .NET 10 Apps. It's an easy-to-use library that seamlessly integrates into existing ServiceStack Apps, with a built-in Management UI providing real-time monitoring, inspection and management of background jobs.

:::youtube 2Cza_a_rrjA
Durable Background Jobs and Scheduled Tasks for .NET 10 Apps
:::

### Durable and Infrastructure-Free

Prior to Background Jobs we'd been using [Background MQ](/background-mq) for executing our background tasks, which lets you queue any Request DTO to execute its API in a background worker. It was our preferred choice as it didn't require any infrastructure dependencies since its concurrent queues are maintained in memory, but this also meant they were non-durable and didn't survive App restarts. Whilst [ServiceStack MQ](/messaging) enables an additional endpoint for your APIs, our main use-case for it was executing background tasks, which is better served by purpose-specific software designed for the task.
#### SQLite Persistence

ServiceStack.Jobs uses SQLite as the backing store for its durability: its low latency, [fast disk persistence](https://www.sqlite.org/fasterthanfs.html) and embeddable file-based design make it ideally suited for the task, allowing creation of naturally partition-able and archivable monthly databases on-the-fly without any maintenance overhead or infrastructure dependencies. This makes it easy to add to any .NET App without impacting or adding load to its existing configured databases.

### Queue APIs or Commands

For even greater reuse you're able to queue your existing ServiceStack APIs as a Background Job, in addition to [Commands](/commands) added in the [v8.3 release](/releases/v8_03) for encapsulating units of logic into internal invokable, inspectable and auto-retryable building blocks.

### Real Time Admin UI

The Background Jobs Admin UI provides a real-time view into the status of all background jobs including their progress, completion times, Executed, Failed and Cancelled Jobs, etc. which is useful for monitoring and debugging purposes.
![](/img/pages/jobs/jobs-dashboard.webp)

View Real-time progress of queued Jobs

![](/img/pages/jobs/jobs-queue.webp)

View real-time progress logs of executing Jobs

![](/img/pages/jobs/jobs-logs.webp)

View Job Summary and Monthly Databases of Completed and Failed Jobs

![](/img/pages/jobs/jobs-completed.webp)

View full state and execution history of each Job

![](/img/pages/jobs/jobs-failed.webp)

Cancel Running jobs and Requeue failed jobs

### Feature Overview

Despite being a v1 release it packs all the features we wanted in a Background Jobs solution, including:

- No infrastructure dependencies
- Monthly archivable rolling Databases with full Job Execution History
- Execute existing APIs or versatile Commands
- Commands auto registered in IOC
- Scheduled Recurring Tasks
- Track Last Job Run
- Serially execute jobs with the same named Worker
- Queue Jobs dependent on successful completion of parent Job
- Queue Jobs to be executed after a specified Date
- Execute Jobs within the context of an Authenticated User
- Auto retry failed jobs on a default or per-job limit
- Timeout Jobs on a default or per-job limit
- Cancellable Jobs
- Requeue Failed Jobs
- Execute custom callbacks on successful execution of Job
- Maintain Status, Logs and Progress of Executing Jobs
- Execute transient (i.e. non-durable) jobs using named workers
- Attach optional `Tag`, `BatchId`, `CreatedBy`, `ReplyTo` and `Args` with Jobs

Please [let us know](https://servicestack.net/ideas) if there are any other missing features you would love to see implemented.

## Install

As it's more versatile and better suited, we've replaced the usage of Background MQ with ServiceStack.Jobs in all **.NET 10 Identity Auth Templates** for sending Identity Auth Confirmation Emails when SMTP is enabled.
So the easiest way to get started with ServiceStack.Jobs is to [create a new Identity Auth Project](https://servicestack.net/start), e.g:

:::sh
npx create-net blazor-vue MyApp
:::

### Existing .NET 10 Templates

Existing .NET 10 Projects can configure their app to use **ServiceStack.Jobs** by mixing in:

:::sh
npx add-in jobs
:::

Which adds the `Configure.BackgroundJobs.cs` [Modular Startup](https://docs.servicestack.net/modular-startup) configuration and a **ServiceStack.Jobs** NuGet package reference to your project.

## Usage

Any API, Controller or Minimal API can execute jobs with the `IBackgroundJobs` dependency, e.g. here's how you can run a background job to send a new email when an API is called in any new Identity Auth template:

```csharp
class MyService(IBackgroundJobs jobs) : Service
{
    public object Any(MyOrder request)
    {
        var jobRef = jobs.EnqueueCommand<SendEmailCommand>(new SendEmail {
            To = "my@email.com",
            Subject = $"Received New Order {request.Id}",
            BodyText = $"""
                Order Details:
                {request.OrderDetails.DumpTable()}
                """,
        });
        //...
    }
}
```

Which records and immediately executes a worker to execute the `SendEmailCommand` with the specified `SendEmail` Request argument. It also returns a reference to the Job which can be used later to query and track its execution.

Alternatively a `SendEmail` API could be executed with just the Request DTO:

```csharp
var jobRef = jobs.EnqueueApi(new SendEmail {
    To = "my@email.com",
    Subject = $"Received New Order {request.Id}",
    BodyText = $"""
        Order Details:
        {request.OrderDetails.DumpTable()}
        """,
});
```

Although Sending Emails is typically not an API you want to make externally available, so you'd want to either [Restrict access](/auth/restricting-services) or [limit usage to specified users](/auth/identity-auth#declarative-validation-attributes).

In both cases the `SendEmail` Request is persisted into the Jobs SQLite database for durability and gets updated as it progresses through the queue.
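The returned Job reference can later be used to look up the job's state. As a hedged sketch of that follow-up — the `GetJob` accessor and the result/DTO shapes below are assumptions for illustration, so check the Jobs API reference for the exact names:

```csharp
// CheckJobStatus/CheckJobStatusResponse are hypothetical DTOs for this sketch
public class JobStatusService(IBackgroundJobs jobs) : Service
{
    public object Any(CheckJobStatus request)
    {
        // Look up the job by the Id captured from EnqueueCommand/EnqueueApi.
        // GetJob and the properties accessed below are assumptions for illustration.
        var result = jobs.GetJob(request.JobId);
        return new CheckJobStatusResponse {
            State = result?.Summary?.State.ToString(),
            Completed = result?.Completed != null,
        };
    }
}
```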
For execution, the API or command is resolved from the IOC before being invoked with the Request. APIs are executed via the [MQ Request Pipeline](/order-of-operations) and commands are executed using the [Commands Feature](/commands) where they will also be visible in the [Commands Admin UI](/commands#command-admin-ui).

::include jobs-shared.md::

::include command-types.md::

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/background-mq.md

---
slug: background-mq
title: Background MQ Service
---

The `BackgroundMqService` is a full-featured `IMessageService` implementation that provides the functionality of a [distributed MQ Server](/messaging) but doesn't require any infrastructure dependencies. It's ideal for **queueing long-running background tasks** by publishing Request DTOs, controlling execution throughput by creating different sized Thread Pools per message type, inspecting the status and statistics of different MQ Workers, stopping and restarting processing messages, etc.

It's a complete implementation of the same [MQ Message flow](/messaging#message-workflow) and passes the existing MQ Test suites so you'll be able to substitute it for any of the other MQ Servers. But it still doesn't persist messages across App restarts, so we recommend using it in combination with persistence to an external data source - generally a good idea for tracking the status of long-running jobs.

To illustrate an example we'll walk through TechStacks' implementation of what's likely the most popular use of background jobs in Web Apps - sending emails...

## Using Background Service to send Emails

Configuring the `BackgroundMqService` is the same as every other MQ Server, i.e.
register it in the IOC and register handlers for the Request DTO of each Service you want to be able to run in the background:

```csharp
container.Register<IMessageService>(c => new BackgroundMqService());
var mqServer = container.Resolve<IMessageService>();

mqServer.RegisterHandler<SendNotification>(ExecuteMessage, 4);
mqServer.RegisterHandler<SendSystemEmail>(ExecuteMessage);

AfterInitCallbacks.Add(host => {
    mqServer.Start();
    ExecuteService(new RetryPendingNotifications());
});
```

The one difference is that we also add an `AfterInitCallbacks` callback to Execute the [RetryPendingNotifications](https://github.com/NetCoreApps/TechStacks/blob/c89920d92e1e11a5495bf88a45fea60aea9d199e/src/TechStacks.ServiceInterface/Admin/NotificationServices.cs#L51) Service after the AppHost has started. We'll look at the implementation later, but it's for re-queueing any incomplete Background Jobs that failed to complete.

With the handlers registered, any Service can queue any of these Services to Execute in the background by publishing a populated Request DTO of that Type.
One place where TechStacks does this is to notify all subscribers when someone creates a post, which it does by [calling SendNotificationAsync()](https://github.com/NetCoreApps/TechStacks/blob/973eecdc334687e13008aa9f07444e7c6affcfd9/src/TechStacks.ServiceInterface/PostServices.cs#L62): ```csharp await SendNotificationAsync(nameof(CreatePost), nameof(Post), id); ``` A common API that inserts an entry in the `Notification` table and publishes a `SendNotification` message to have the Service executed in the background by 1 of the 4 MQ Workers configured at Startup: ```csharp public async Task SendNotificationAsync(string eventName, string refType, long refId) { var notificationId = await Db.InsertAsync(ToNotification(eventName, refType, refId), selectIdentity:true); PublishMessage(new SendNotification { Id = notificationId }); } Notification ToNotification(string eventName, string refType, long refId) => new Notification { Event = eventName, RefId = refId, RefType = refType, RefUrn = $"urn:{refType}:{refId}", Created = DateTime.Now, }; ``` `SendNotification` is a regular ServiceStack Service except we only want it accessible to Admin Users so it's annotated with `[ExcludeMetadata]` to hide it from the public metadata services. ```csharp [ExcludeMetadata] [Route("/notifications/{Id}/send")] public class SendNotification : IReturnVoid { public long Id { get; set; } } ``` For the complete reference [NotificationServices.cs](https://github.com/NetCoreApps/TechStacks/blob/master/src/TechStacks.ServiceInterface/Admin/NotificationServices.cs) contains all the background Email Services and bespoke code to send the different Email types whilst [NotificationServices.Utils.cs](https://github.com/NetCoreApps/TechStacks/blob/master/src/TechStacks.ServiceInterface/Admin/NotificationServices.Utils.cs) contains reusable functionality shared by the different email implementations. 
The `SendNotification` Service sends a different Email based on the Notification Event Type, all executed within the same managed implementation below which takes care of marking the completion of the notification, either with the time it successfully completed or the Exception it failed with:

```csharp
[RequiredRole("Admin")]
public partial class NotificationServices : Service
{
    private static ILog log = LogManager.GetLogger(typeof(NotificationServices));

    Func<Notification, Task> GetEventHandler(string eventName)
    {
        switch (eventName)
        {
            case nameof(CreatePost):
                return SendNewPostEmail;
            case nameof(UserPostReport):
                return SendReportPostEmail;
            case nameof(UserPostCommentReport):
                return SendReportCommentEmail;
        }
        return null;
    }

    public async Task Any(SendNotification request)
    {
        var notification = AssertNotification(request.Id);

        var eventHandler = GetEventHandler(notification.Event);
        if (eventHandler != null)
        {
            try
            {
                await eventHandler(notification);

                await Db.UpdateOnlyAsync(() => new Notification { Completed = DateTime.Now },
                    where: x => x.Id == notification.Id);
            }
            catch (Exception ex)
            {
                await Db.UpdateOnlyAsync(() => new Notification {
                        Failed = DateTime.Now,
                        Error = ex.Message + Environment.NewLine + ex
                    },
                    where: x => x.Id == notification.Id);
                throw;
            }
        }
        else
        {
            log.Warn($"Received notification of unknown Event Type: {notification.Event}");
        }
    }
}
```

The creation of each Email Template is split into different steps to ensure all users are sent the same rendered Email snapshot, even if the task failed midway through and had to be replayed. Each template follows the same approach:

- Work out all users the email should be sent to
- Retrieve all data required by the template and inject it into a new [ServiceStack ScriptContext](https://sharpscript.net/docs/installation)
- Use the context to render the specified [email template](https://github.com/NetCoreApps/TechStacks/tree/master/src/TechStacks/emails).
In this case it renders the [post-new.html](https://github.com/NetCoreApps/TechStacks/blob/master/src/TechStacks/emails/post-new.html) Template inside the [_layout.html](https://github.com/NetCoreApps/TechStacks/blob/master/src/TechStacks/emails/_layout.html) - which is based on the [Email Bootstrap Template](https://github.com/seanpowell/Email-Boilerplate/blob/master/email_commentsremoved.html) and used as the layout for all email templates. ```csharp private async Task SendNewPostEmail(Notification notification) { EmailTemplate template = null; if (notification.EmailTemplateId == null) { var post = await AssertPost(notification.RefId); var org = await Db.SingleByIdAsync<Organization>(post.OrganizationId); var user = await Db.SingleByIdAsync<CustomUserAuth>(post.UserId); var q = Db.From<OrganizationSubscription>() .Where(x => x.OrganizationId == post.OrganizationId) .And("ARRAY[{0}] && post_types", post.Type) .Select(x => x.UserId); var postTypeSubscriberUserIds = await Db.ColumnAsync<int>(q); var context = CreateEmailTemplateContext(); var templatePath = "emails/post-new"; var page = context.GetPage(templatePath); var result = new PageResult(page) { Args = { ["baseUrl"] = AppSettings.GetString("PublicBaseUrl"), ["post"] = post, ["organization"] = org, } }; template = await CreateAndSaveEmailTemplate(notification, nameof(SendNewPostEmail), templatePath, toUserIds: postTypeSubscriberUserIds, fromName: user.DisplayName ?? user.UserName, ccName: org.Name + " Subscribed", subject: $"[{post.Type}] {post.Title}", html: await result.RenderToStringAsync()); } else { template = await Db.SingleByIdAsync<EmailTemplate>(notification.EmailTemplateId); } await SendEmailsToRemainingUsers(notification, template); } ``` The end result of each email is to create an entry in the generic [EmailTemplate](https://github.com/NetCoreApps/TechStacks/blob/master/src/TechStacks.ServiceInterface/DataModel/EmailTemplate.cs) table with the rendered email to send and all users to send it to. 
It's then handed to the managed `SendEmailsToRemainingUsers` routine to send the emails.

The final step is to send the email to all designated users, which is ultimately done by the [EmailProvider](https://github.com/NetCoreApps/TechStacks/blob/master/src/TechStacks.ServiceInterface/Notifications/EmailProvider.cs) which uses an `SmtpClient` to send the Email to the AWS SES endpoint.

To handle cases where the long-running process can fail at any point, the email template keeps a record of each user emails were sent to by updating the `emailed_user_ids` PostgreSQL Array after each email is sent. So if the `SendNotification` message is replayed it will start back where it left off and only send emails to the remaining users.

```csharp
private async Task SendEmailsToRemainingUsers(Notification notification, EmailTemplate template)
{
    var remainingUserIds = notification.UserIds.Where(x => !notification.EmailedUserIds.Contains(x)).ToList();
    if (remainingUserIds.Count > 0)
    {
        var users = await Db.SelectAsync<UserEmailInfo>(Db.From<CustomUserAuth>()
            .Where(x => remainingUserIds.Contains(x.Id)));
        var userMap = users.ToDictionary(x => x.Id);

        foreach (var userId in remainingUserIds)
        {
            var user = userMap[userId];
            if (!string.IsNullOrEmpty(user.Email))
            {
                Email.Send(template.ToEmailMessage(user.Email, user.DisplayName ?? user.UserName));
            }
            await RecordEmailSentToUser(notification.Id, userId);
        }
    }
    else
    {
        SendNotificationEmail(template, $"{notification.UserIds.Length} subscribers");
    }
}

private void SendNotificationEmail(EmailTemplate template, string toName)
{
    var notificationsEmail = AppSettings.GetString("NotificationsFromEmail");
    var email = template.ToEmailMessage(notificationsEmail, toName);
    Email.Send(email);
}

private async Task RecordEmailSentToUser(long notificationId, int userId)
{
    await Db.ExecuteSqlAsync(
        @"UPDATE notification SET emailed_user_ids = emailed_user_ids || @userId WHERE id = @id",
        new { userId, id = notificationId });
}
```

## Replaying Messages

The `RetryPendingNotifications` Service replays incomplete notifications by publishing new `SendNotification` messages which are executed by the `BackgroundMqService` as normal. This also lets you replay failed notifications by setting `Failed` to `null` and recalling the Service. As the state of each task is persisted after each step, it can fail at any point and the replayed task will be able to restart where it left off.

```csharp
public object Any(RetryPendingNotifications request)
{
    var pendingNotificationIds = Db.Column<long>(Db.From<Notification>()
        .Where(x => x.Completed == null && x.Failed == null)
        .Select(x => x.Id))
        .ToArray();

    if (pendingNotificationIds.Length > 0)
    {
        log.Info($"Resending {pendingNotificationIds.Length} pending notifications: {pendingNotificationIds}");
        foreach (var notificationId in pendingNotificationIds)
        {
            PublishMessage(new SendNotification { Id = notificationId });
        }
    }

    return new RetryPendingNotificationsResponse { ResentIds = pendingNotificationIds };
}
```

## MQ Status

The other benefit of persisting the status of each task is being able to inspect the `Notification` and `EmailTemplate` tables to monitor the progress of each Task.
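For example, the same OrmLite APIs used above can surface in-flight and failed tasks directly from the `Notification` table — the exact queries below are illustrative:

```csharp
// Notifications still in-flight: queued or executing, neither completed nor failed
var pending = Db.Select(Db.From<Notification>()
    .Where(x => x.Completed == null && x.Failed == null));

// Failed notifications with their recorded Error, most recent failures first
var failed = Db.Select(Db.From<Notification>()
    .Where(x => x.Failed != null)
    .OrderByDescending(x => x.Failed));
```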
We can also call the [IMessageService](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Messaging/IMessageService.cs) APIs to inspect the state of the Background MQ Service. We can use the Service below to make the APIs accessible remotely:

```csharp
[Route("/mq/stop")] // Stop the Background Service and all MQ Workers from processing more messages
public class MqStop : IReturn<string> {}

[Route("/mq/start")] // Start the Background Service and process any queued messages
public class MqStart : IReturn<string> {}

[Route("/mq/stats")]
public class MqStats : IReturn<string> {}

[Route("/mq/status")]
public class MqStatus : IReturn<string> {}

public class BackgroundAdminServices : Service
{
    public IMessageService MqService { get; set; }

    [RequiredRole("Admin")]
    public object Any(MqStart request)
    {
        MqService.Start();
        return "OK";
    }

    [RequiredRole("Admin")]
    public object Any(MqStop request)
    {
        MqService.Stop();
        return "OK";
    }

    public object Any(MqStats request) => MqService.GetStats();

    [AddHeader(ContentType = MimeTypes.PlainText)]
    public object Any(MqStatus request) => MqService.GetStatsDescription();
}
```

This lets you call [/mq/stats](https://techstacks.io/mq/stats.json) to view a summary of **all messages processed** since the last time the App was restarted and [/mq/status](https://techstacks.io/mq/status) to view **all Queues** the Background Service is currently listening to and the **statistics of each individual MQ worker**.
Here's a snapshot of what this looks like for TechStacks with 4 threads listening to `SendNotification` messages and 1 thread listening to `SendSystemEmail`: ``` # MQ SERVER STATS: STATUS: Started LISTENING ON: mq:SendNotification.inq mq:SendNotification.inq mq:SendNotification.inq mq:SendNotification.inq mq:SendSystemEmail.inq ------------------------------ # COLLECTIONS: ------------------------------ INFO SendNotification: STATS: Thread Count: 4 Total Messages Added: 27 Total Messages Taken: 0 Total .outq Messages: 27 Total .dlq Messages: 0 QUEUES: mq:SendNotification.inq: 0 message(s) mq:SendNotification.priorityq: 0 message(s) mq:SendNotification.dlq: 0 message(s) mq:SendNotification.outq: 27 message(s) ------------------------------ INFO SendSystemEmail: STATS: Thread Count: 1 Total Messages Added: 1 Total Messages Taken: 0 Total .outq Messages: 1 Total .dlq Messages: 0 QUEUES: mq:SendSystemEmail.inq: 0 message(s) mq:SendSystemEmail.priorityq: 0 message(s) mq:SendSystemEmail.dlq: 0 message(s) mq:SendSystemEmail.outq: 1 message(s) ------------------------------ # WORKERS: ------------------------------ WORKER 1 on mq:SendNotification.inq STATS for SendNotification: TotalNormalMessagesReceived: 7 TotalPriorityMessagesReceived: 0 TotalProcessed: 7 TotalRetries: 0 TotalFailed: 0 LastMessageProcessed: 4/9/18 7:44:49 PM ------------------------------ WORKER 2 on mq:SendNotification.inq STATS for SendNotification: TotalNormalMessagesReceived: 7 TotalPriorityMessagesReceived: 0 TotalProcessed: 7 TotalRetries: 0 TotalFailed: 0 LastMessageProcessed: 4/9/18 7:49:17 PM ------------------------------ WORKER 3 on mq:SendNotification.inq STATS for SendNotification: TotalNormalMessagesReceived: 7 TotalPriorityMessagesReceived: 0 TotalProcessed: 7 TotalRetries: 0 TotalFailed: 0 LastMessageProcessed: 4/9/18 8:28:59 PM ------------------------------ WORKER 4 on mq:SendNotification.inq STATS for SendNotification: TotalNormalMessagesReceived: 6 TotalPriorityMessagesReceived: 0 
TotalProcessed: 6 TotalRetries: 0 TotalFailed: 0 LastMessageProcessed: 4/9/18 7:41:18 PM ------------------------------ WORKER 5 on mq:SendSystemEmail.inq STATS for SendSystemEmail: TotalNormalMessagesReceived: 1 TotalPriorityMessagesReceived: 0 TotalProcessed: 1 TotalRetries: 0 TotalFailed: 0 LastMessageProcessed: 4/9/18 7:44:47 PM ------------------------------ ```

### MQ Collection Stats

You can also get info on the Queue Collection for a specific DTO Type with:

```csharp
var bgService = (BackgroundMqService)HostContext.Resolve<IMessageService>();
var mqCollection = bgService.GetCollection(typeof(Poco));
Dictionary<string, long> statsMap = mqCollection.GetDescriptionMap();
```

Which returns the text info that [mqCollection.GetDescription()](/background-mq#mq-status) returns, but in a structured Dictionary using the keys:

- `ThreadCount`
- `TotalMessagesAdded`
- `TotalMessagesTaken`
- `TotalOutQMessagesAdded`
- `TotalDlQMessagesAdded`

The dictionary also includes the snapshot counts of each queue in the MQ Collection, e.g:

- `mq:Poco.inq`
- `mq:Poco.priorityq`
- `mq:Poco.outq`
- `mq:Poco.dlq`

You can also get the Stats of each MQ Worker, or if you have multiple workers for a Request Type you can access them with:

```csharp
IMqWorker[] workers = bgService.GetWorkers(QueueNames<Type>.In);
List<IMessageHandlerStats> stats = workers.Map(x => x.GetStats());
```

Then combine them to get their cumulative result:

```csharp
IMessageHandlerStats combinedStats = stats.CombineStats();
```

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/templates/blazor-bootstrap.md

---
title: Blazor WASM Bootstrap
---

<div class="not-prose hide-title my-8 ml-20 flex flex-col items-center"> <div> <svg class="w-44 h-44 text-purple-600 mr-8" xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24"><path fill="currentColor" d="M23.834 8.101a13.912 13.912 0 0 1-13.643 11.72a10.105 10.105 0 0
1-1.994-.12a6.111 6.111 0 0 1-5.082-5.761a5.934 5.934 0 0 1 11.867-.084c.025.983-.401 1.846-1.277 1.871c-.936 0-1.374-.668-1.374-1.567v-2.5a1.531 1.531 0 0 0-1.52-1.533H8.715a3.648 3.648 0 1 0 2.695 6.08l.073-.11l.074.121a2.58 2.58 0 0 0 2.2 1.048a2.909 2.909 0 0 0 2.695-3.04a7.912 7.912 0 0 0-.217-1.933a7.404 7.404 0 0 0-14.64 1.603a7.497 7.497 0 0 0 7.308 7.405s.549.05 1.167.035a15.803 15.803 0 0 0 8.475-2.528c.036-.025.072.025.048.061a12.44 12.44 0 0 1-9.69 3.963a8.744 8.744 0 0 1-8.9-8.972a9.049 9.049 0 0 1 3.635-7.247a8.863 8.863 0 0 1 5.229-1.726h2.813a7.915 7.915 0 0 0 5.839-2.578a.11.11 0 0 1 .059-.034a.112.112 0 0 1 .12.053a.113.113 0 0 1 .015.067a7.934 7.934 0 0 1-1.227 3.549a.107.107 0 0 0-.014.06a.11.11 0 0 0 .073.095a.109.109 0 0 0 .062.004a8.505 8.505 0 0 0 5.913-4.876a.155.155 0 0 1 .055-.053a.15.15 0 0 1 .147 0a.153.153 0 0 1 .054.053A10.779 10.779 0 0 1 23.834 8.1zM8.895 11.628a2.188 2.188 0 1 0 2.188 2.188v-2.042a.158.158 0 0 0-.15-.15Z"/></svg> </div> <h1 class="text-3xl font-extrabold tracking-tight text-gray-900 dark:text-gray-50 sm:text-4xl">Blazor WASM Bootstrap Template</h1> </div> The feature-rich Blazor WASM Bootstrap template is ideal for teams with strong C# skills building Line Of Business (LOB) applications. Utilizing Blazor WebAssembly (WASM) with a ServiceStack backend yields an optimal frictionless [API First development model](/api-first-development) where UIs can bind directly to Typed DTOs whilst benefiting from ServiceStack's [structured error handling](/validation) & rich contextual form validation binding. 
<lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="TIgjMf_vtCI" style="background-image: url('https://img.youtube.com/vi/TIgjMf_vtCI/maxresdefault.jpg')"></lite-youtube>

## Getting Started

Customize and Download a new Blazor WASM Bootstrap project with your preferred project name:

<h3 class="text-center">Download new C# Blazor WASM Project</h3>

<blazor-templates class="not-prose pb-8"></blazor-templates>

Alternatively you can create & download a new Blazor Project with the [x dotnet tool](/dotnet-new):

:::sh
npx create-net LegacyTemplates/blazor-wasm ProjectName
:::

## Optimal Development Workflow

By utilizing ServiceStack's [decoupled project structure](/physical-project-structure), combined with Blazor enabling C# on the client, we're able to get 100% reuse of your API's shared DTOs as-is to enable an end-to-end Typed API automatically, free from any additional tooling or code-gen complexity.

<lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="BcQqCzm4tK0" style="background-image: url('https://img.youtube.com/vi/BcQqCzm4tK0/maxresdefault.jpg')"></lite-youtube>

## Api and ApiAsync methods

.NET was originally conceived to use Exceptions for error control flow, however there's been a tendency in modern languages & libraries to shun Exceptions and return errors as normal values, an approach we believe is a more flexible & ergonomic way to handle API responses.
### The ApiResult way

The `Api(Request)` and `ApiAsync(Request)` APIs return a typed `ApiResult<Response>` Value Result encapsulating either a Typed Response or a structured API Error populated in `ResponseStatus`, allowing you to handle API responses programmatically without `try/catch` handling:

The example below creates a new Booking:

```csharp
CreateBooking request = new();

ApiResult<IdResponse> api = new();

async Task OnSubmit()
{
    api = await Client.ApiAsync(request);

    if (api.Succeeded)
    {
        await done.InvokeAsync(api.Response!);
        request = new();
    }
}
```

Which despite its terseness handles both **success** and **error** API responses: **if successful** it invokes the `done()` callback, notifying its parent of the new Booking API Response before resetting the Form's data model with a new Request DTO.

Upon **failure** the error response is populated in `api.Error` which binds to the UI via Blazor's `<CascadingValue Value=@api.Error>` to propagate it to all its child components in order to show contextual validation errors next to their respective Input controls.

## JSON API Client

The recommended way to configure a Service Client for your Blazor WASM Apps is to use `AddBlazorApiClient()`, e.g:

```csharp
builder.Services.AddBlazorApiClient(builder.Configuration["ApiBaseUrl"] ?? builder.HostEnvironment.BaseAddress);
```

Which registers a typed Http Client factory returning a recommended pre-configured `JsonApiClient` to communicate with your back-end ServiceStack APIs, including support for CORS which is required when hosting the decoupled UI on a different server (e.g. CDN) to your APIs.

If you're deploying your Blazor WASM UI to a CDN you'll need to specify the URL for the server, otherwise if it's deployed together with your Server App you can use the Host's Base Address.
### Public Pages & Components

To reduce boilerplate, your Blazor Pages & components can inherit the template's local [AppComponentBase.cs](https://github.com/LegacyTemplates/blazor-wasm/blob/main/MyApp.Client/AppComponentBase.cs) which inherits `BlazorComponentBase`, is injected with the `JsonApiClient` and provides short-hand access to its most common APIs:

```csharp
public class BlazorComponentBase : ComponentBase, IHasJsonApiClient
{
    [Inject]
    public JsonApiClient? Client { get; set; }

    public virtual Task<ApiResult<TResponse>> ApiAsync<TResponse>(IReturn<TResponse> request) => Client!.ApiAsync(this, request);
    public virtual Task<ApiResult<EmptyResponse>> ApiAsync(IReturnVoid request) => Client!.ApiAsync(this, request);
    public virtual Task<TResponse> SendAsync<TResponse>(IReturn<TResponse> request) => Client!.SendAsync(this, request);
}
```

### Protected Pages & Components

Pages and Components requiring Authentication should instead inherit from [AppAuthComponentBase](https://github.com/LegacyTemplates/blazor-wasm/blob/main/MyApp.Client/AppComponentBase.cs) which integrates with Blazor's Authentication Model to provide access to the currently authenticated user:

```csharp
public abstract class AppAuthComponentBase : AppComponentBase
{
    [CascadingParameter]
    protected Task<AuthenticationState>? AuthenticationStateTask { get; set; }

    protected bool HasInit { get; set; }

    protected bool IsAuthenticated => User?.Identity?.IsAuthenticated ?? false;

    protected ClaimsPrincipal? User { get; set; }

    protected override async Task OnParametersSetAsync()
    {
        var state = await AuthenticationStateTask!;
        User = state.User;
        HasInit = true;
    }
}
```

## Benefits of Shared DTOs

Typically with Web Apps our client uses a different language to C#, so equivalent Request DTOs need to be generated for the client.
### TypeScript Example

For example, TypeScript-generated DTOs still give us typed end-to-end services with the help of tooling like [Add ServiceStack Reference](/add-servicestack-reference):

```csharp
[Route("/hello/{Name}")]
public class Hello : IReturn<HelloResponse>
{
    public string Name { get; set; }
}

public class HelloResponse
{
    public string Result { get; set; }
}
```

Turns into:

```typescript
// @Route("/hello/{Name}")
export class Hello implements IReturn<HelloResponse>
{
    public name: string;

    public constructor(init?: Partial<Hello>) { (Object as any).assign(this, init); }
    public getTypeName() { return 'Hello'; }
    public getMethod() { return 'POST'; }
    public createResponse() { return new HelloResponse(); }
}

export class HelloResponse
{
    public result: string;
    public responseStatus: ResponseStatus;

    public constructor(init?: Partial<HelloResponse>) { (Object as any).assign(this, init); }
}
```

When Request or Response DTOs change during development, the client DTOs need to be regenerated using a command like [`x csharp`](./add-servicestack-reference.md#simple-command-line-utilities).

### Blazor WASM Example

When developing your Blazor WASM UI however, you just change your shared Request/Response DTOs in the shared `ServiceModel` project and both your client and server compile against the same Request/Response DTO classes. This eliminates the need for any additional step.
In the `ServiceModel` project, we still have:

```csharp
[Route("/hello/{Name}")]
public class Hello : IReturn<HelloResponse>
{
    public string Name { get; set; }
}

public class HelloResponse
{
    public string Result { get; set; }
}
```

Which the Blazor C# App can use directly in its **.razor** pages:

```csharp
@code {
    Hello request = new() { Name = "Blazor WASM" };

    ApiResult<HelloResponse> api = new();

    protected override async Task OnInitializedAsync() => await submit();

    async Task submit() => api = await ApiAsync(request);
}
```

## ServiceStack.Blazor

The **ServiceStack.Blazor** library contains integrated functionality for Blazor including an optimal JSON API HttpClient Factory, API-enabled base components, HTML Utils and Bootstrap & Tailwind UI Input components heavily utilized throughout the template.

### Built-in Blazor and Tailwind UI Components

The built-in UI Components enable a clean & productive dev model. They share the same base classes, making them functionally equivalent and swappable when switching CSS frameworks by updating their namespace in your project's `_Imports.razor`.

The Blazor Components in **ServiceStack.Blazor** include:

| Component         | Description                                                                        |
|-------------------|------------------------------------------------------------------------------------|
| `<TextInput>`     | Text Input control for string properties                                           |
| `<DateTimeInput>` | Date Input control for Date properties                                             |
| `<CheckboxInput>` | Checkbox Input control for Boolean properties                                      |
| `<SelectInput>`   | Select Dropdown for properties with a finite list of values like Enums             |
| `<TextAreaInput>` | Text Input control for large strings                                               |
| `<DynamicInput>`  | Dynamic component utilizing the appropriate above Input controls in Auto Forms     |
| `<AlertSuccess>`  | Displays successful notification feedback                                          |
| `<ErrorSummary>`  | Displays an error summary message when no contextual field validation is available |
| `<FileUpload>`    | Used with `FilesUploadFeature` and the `UploadTo` attribute to upload files        |

::: info
All Input controls support contextual validation of ServiceStack's existing [structured Error responses](/error-handling)
:::

### Themable

Should it be needed, all components are themable by running the included [README.ss](https://github.com/LegacyTemplates/blazor-wasm/blob/main/MyApp.Client/Shared/Components/README.ss) executable documentation which copies its **Razor** UI markup locally into your project, enabling customization of the UI including controls.

### Bookings CRUD

The C# Service Client `Api*` methods make calling remote ServiceStack APIs similar to calling a C# method: the returned `ApiResult<Response>` encapsulates both a typed **Error** and the API **Response**, offering an alternate way to handle errors since all components can bind directly to its `api.Error`. The reusability extends to your API's typed Request DTOs, which components can 2-way data bind to directly.

Below is an example of a CRUD Booking form [BookingsCrud/Create.razor](https://github.com/ServiceStack/ServiceStack/blob/master/tests/ServiceStack.Blazor.Tests/Client/Pages/BookingsCrud/Create.razor) used to Create Bookings:

```html
@attribute [Authorize(Roles="Employee")]
@inherits AppAuthComponentBase

<form @onsubmit="_ => OnSubmit()" @onsubmit:preventDefault
      class=@CssUtils.ClassNames("relative shadow rounded p-4",@class)>
<CascadingValue Value=@api.Error>
    <button type="button" class="close" @onclick="close"><i></i></button>
    <h1 class="fs-4 text-secondary text-center">
        New Booking
    </h1>
    <ErrorSummary Except=@VisibleFields />
    <div class="mb-3 form-floating">
        <TextInput @bind-Value="request.Name" required placeholder="Name for this booking" />
    </div>
    <div class="mb-3 form-floating">
        <SelectInput @bind-Value="request.RoomType" Options=@(Enum.GetValues<RoomType>()) />
    </div>
    <div class="d-flex">
        <div class="mb-3 flex-fill form-floating me-1">
            <TextInput type="number" @bind-Value="request.RoomNumber" min="0" required />
        </div>
        <div class="mb-3 flex-fill form-floating">
            <TextInput
                type="number" @bind-Value="request.Cost" min="0" required />
        </div>
    </div>
    <div class="d-flex">
        <div class="mb-3 flex-fill form-floating me-1">
            <DateTimeInput @bind-Value="request.BookingStartDate" required />
        </div>
        <div class="mb-3 flex-fill form-floating">
            <DateTimeInput @bind-Value="request.BookingEndDate" />
        </div>
    </div>
    <div class="mb-3 form-floating">
        <TextAreaInput @bind-Value="request.Notes" placeholder="Notes about this booking" style="height:6rem" />
    </div>
    <div class="d-flex justify-content-between align-items-center">
        <div>
            <button type="submit" class="btn btn-primary">Create Booking</button>
        </div>
    </div>
</CascadingValue>
</form>

@code {
    [Parameter] public EventCallback<IdResponse> done { get; set; }
    [Parameter] public string? @class { get; set; }

    CreateBooking request = new() {
        BookingStartDate = DateTime.UtcNow,
    };

    // Hide Error Summary Messages for Visible Fields which display contextual validation errors
    string[] VisibleFields => new[] {
        nameof(request.Name),
        nameof(request.RoomType),
        nameof(request.RoomNumber),
        nameof(request.BookingStartDate),
        nameof(request.BookingEndDate),
        nameof(request.Cost),
        nameof(request.Notes),
    };

    ApiResult<IdResponse> api = new();

    async Task OnSubmit()
    {
        api = await ApiAsync(request);

        if (api.Succeeded)
        {
            await done.InvokeAsync(api.Response!);
            request = new();
        }
    }

    async Task close() => await done.InvokeAsync(null);
}
```

Which binds directly to the [CreateBooking](https://github.com/NetCoreTemplates/blazor-vue/blob/main/MyApp.ServiceModel/Bookings.cs) Request DTO:

```csharp
[Tag("bookings"), Description("Create a new Booking")]
[Route("/bookings", "POST")]
[ValidateHasRole("Employee")]
[AutoApply(Behavior.AuditCreate)]
public class CreateBooking : ICreateDb<Booking>, IReturn<IdResponse>
{
    [Description("Name this Booking is for"), ValidateNotEmpty]
    public string Name { get; set; }
    public RoomType RoomType { get; set; }
    [ValidateGreaterThan(0)]
    public int RoomNumber { get; set; }
    [ValidateGreaterThan(0)]
    public decimal Cost { get; set; }
    public DateTime BookingStartDate { get; set; }
    public DateTime? BookingEndDate { get; set; }
    [Input(Type = "textarea")]
    public string? Notes { get; set; }
}
```

This is all that's needed to initially render the form:

<a class="flex flex-col justify-center items-center my-8" href="https://vue-static.web-templates.io/bookings-crud">
    <img src="/img/pages/jamstack/blazor-wasm/booking-new.png" class="max-w-screen-md" />
</a>

Its `[ValidateNotEmpty]` [declarative validator](/declarative-validation) filters down to the **client Input** UI to prevent unnecessary invalid API requests:

<a class="flex flex-col justify-center items-center my-8" href="https://vue-static.web-templates.io/bookings-crud">
    <img src="/img/pages/jamstack/blazor-wasm/booking-new-validation-client.png" class="max-w-screen-md" />
</a>

Validation of server error responses looks like:

<a class="flex flex-col justify-center items-center my-8" href="https://vue-static.web-templates.io/bookings-crud">
    <img src="/img/pages/jamstack/blazor-wasm/booking-new-validation-server.png" class="max-w-screen-md" />
</a>

#### Use of `form`

Validation errors use the standard `<form>` element, and `api.Error` responses can be passed directly to each control or, as in this case, cascaded to child component controls with Blazor's built-in `<CascadingValue Value=@api.Error>` so they can display contextual errors.

#### Integrated Auth

It uses the template's included [AppComponentBase](https://github.com/LegacyTemplates/blazor-wasm/blob/main/MyApp.Client/AppComponentBase.cs) which integrates with Blazor's Auth model, allowing it to use its standard `[Authorize(Roles="Employee")]` attributes and providing access to the Authenticated User's info, populated from Secure, HttpOnly JWT Cookies for secure stateless client Authentication that works across App deployments and without any server infrastructure.
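The `ApiResult<Response>` shape the form binds against is simple enough to sketch. Below is a hypothetical TypeScript analog (not ServiceStack's actual C# class) illustrating the pattern: one object holds either a typed response or a structured error, so UI controls can bind to a single `api` value for both outcomes:

```typescript
// Hypothetical analog of ApiResult<TResponse>: a single value that carries
// either a typed response or a structured error, so components bind to one object.
class ResponseError {
    constructor(public fieldName: string, public message: string) {}
}
class ResponseStatus {
    constructor(public errorCode: string, public message: string,
                public errors: ResponseError[] = []) {}
}
class ApiResult<TResponse> {
    response?: TResponse;
    error?: ResponseStatus;
    get succeeded() { return this.error == null && this.response != null; }
    // Contextual validation: the field-level error an Input control displays
    fieldError(name: string) {
        return this.error?.errors.find(x => x.fieldName.toLowerCase() === name.toLowerCase());
    }
}

// Success and failure populate the same shape:
const ok = new ApiResult<{ id: number }>();
ok.response = { id: 1 };

const failed = new ApiResult<{ id: number }>();
failed.error = new ResponseStatus('NotEmpty', "'Name' must not be empty.",
    [new ResponseError('Name', "'Name' must not be empty.")]);
```

This is why no try/catch appears in `OnSubmit()` above: errors arrive as data the bound controls already know how to render.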
Public pages can inherit `AppComponentBase` to access ServiceStack.Blazor's [BlazorComponentBase](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Blazor/BlazorComponentBase.cs) and get access to the `JsonApiClient` dependency and related functionality.

#### AutoQuery CRUD Example

As `CreateBooking` is an [AutoQuery CRUD](/autoquery/crud) Request DTO, it works without a server implementation as it uses the default (and [overridable](/autoquery/crud#custom-autoquery-crud-services)) implementation generated by AutoQuery.

## Blazor Trade-offs

Blazor WASM's reuse of C# skills, tooling & libraries offers a compelling advantage for .NET teams, so much so it has become our preferred technology for developing internal LOB applications as it's better able to reuse existing C# investments in an integrated SPA Framework utilizing a single toolchain.

It does however come at the cost of a larger initial download size and a performance cost resulting in a high Time-To-First-Render (TTFR) and an overall poor initial User Experience when served over the Internet, which is further exacerbated over low-speed Mobile connections. This is likely an acceptable trade-off for most LOB applications served over high-speed local networks, but may not be a suitable choice for public Internet sites _(an area our other [jamstacks.net](https://jamstacks.net) templates may serve better)_.

As an SPA it also suffers from poor SEO as content isn't included in the initial page and needs to be rendered in the browser after the App has initialized. For some content-heavy sites this can be a deal-breaker, either requiring proxy rules so content pages are served by a different SEO-friendly site or otherwise prohibiting the use of Blazor WASM entirely.

### Improving Startup Performance

The solution to both issues is fairly straightforward: utilize the mainstay solution behind [Jamstack Frameworks](https://jamstack.org/generators/) and prerender content at build time.
We know what needs to be done, but how best to do it in Blazor WASM?

Unfortunately the [official Blazor WASM prerendering guide](https://docs.microsoft.com/en-us/aspnet/core/blazor/components/prerendering-and-integration?view=aspnetcore-6.0&pivots=webassembly) isn't actually a prerendering solution as the term is typically used to describe [Static Site Generators (SSG)](https://www.netlify.com/blog/2020/04/14/what-is-a-static-site-generator-and-3-ways-to-find-the-best-one/) prerendering static content at build time; the Blazor WASM prerendering docs instead describe a [Server-Side-Rendering (SSR)](https://www.omnisci.com/technical-glossary/server-side-rendering) solution, mandating the additional complexity of maintaining your App's dependencies in both client and server projects.

This approach also won't yield an optimal result since prerendering is typically used so Apps can host their SSG content on static file hosts, whilst SSR does the opposite: its forced runtime coupling to the .NET Server Host prohibits Blazor WASM Apps from being served from a CDN. As this defeats [many of the benefits](https://blazor.web-templates.io/docs/hosting) of a Blazor WASM Jamstack App in the first place, we've instead opted for a more optimal solution that doesn't compromise its CDN hostability.

### Increasing Perceived Performance

We have little opportunity to improve the startup time of the real C# Blazor App beyond hosting its static assets on CDN edge caches, but ultimately what matters is [perceived performance](https://marvelapp.com/blog/a-designers-guide-to-perceived-performance/), which we do have control over, given the loading screen for a default Blazor WASM project is a glaring white screen flash:

![](/img/pages/jamstack/blazor-wasm/loading-default.png)

The longer users have to wait looking at this blank loading screen without signs of progress, the more they'll associate your site with taking forever to load.
One technique many popular sites use to increase perceived performance is to display content placeholders in place of real content, giving the impression that the site has almost loaded and only needs a few moments more for the latest live data to be slotted in.

As an example, here's what YouTube's content placeholders mimicking the page layout look like before the real site has loaded:

![](/img/pages/jamstack/youtube-placeholder.png)

But we can do even better than an inert content placeholder and load a temporary chrome of our App. As this needs to be done before Blazor has loaded, we need to implement it with a sprinkling of HTML + JS.

The first thing we need to do is move the scoped styles of our App's [MainLayout](https://github.com/LegacyTemplates/blazor-wasm/blob/main/MyApp.Client/Shared/MainLayout.razor) and [NavMenu](https://github.com/LegacyTemplates/blazor-wasm/blob/main/MyApp.Client/Shared/NavMenu.razor) into an external [main-layout.css](https://github.com/LegacyTemplates/blazor-wasm/blob/main/MyApp.Client/wwwroot/css/main-layout.css) so our temp App chrome can use it.

Then in our [/wwwroot/index.html](https://github.com/LegacyTemplates/blazor-wasm/blob/main/MyApp.Client/wwwroot/index.html), anything between `<div id="app"></div>` is displayed whilst our Blazor App is loading, before it's replaced with the real App.
So here we just paste in the **MainLayout** markup:

```html
<div id="app">
    <!-- loading: render temp static app chrome to improve perceived performance -->
    <div id="app-loading" class="main-layout page">
        <div class="sidebar">
            <div class="top-row navbar navbar-dark">
                <a class="navbar-brand ps-4" href="/">MyApp</a>
                <button class="navbar-toggler">
                    <span class="navbar-toggler-icon"></span>
                </button>
            </div>
            <div class="collapse">
                <ul class="nav flex-column"></ul>
            </div>
        </div>
        <div class="main">
            <div class="main-top-row px-4">
                <ul class="nav nav-pills"></ul>
                <a href="signin?return=docs/deploy" class="btn btn-outline-primary">
                    Login
                </a>
            </div>
            <div class="content px-4">
                <!--PAGE-->
                <div class="spinner-border float-start mt-2 me-2" role="status">
                    <span class="sr-only"></span>
                </div>
                <h1 style="font-size:36px">
                    Loading...
                </h1>
                <!--/PAGE-->
            </div>
        </div>
    </div>
</div>
```

Less our App's navigation menus, which we'll dynamically generate with the splash of JS below:

```js
const SIDEBAR = `
Home,home,/$
Counter,plus,/counter
Todos,clipboard,/todomvc
Bookings CRUD,calendar,/bookings-crud
Call Hello,transfer,/hello$
Call HelloSecure,shield,/hello-secure
Fetch data,list-rich,/fetchdata
Admin,lock-locked,/admin
Login,account-login,/signin
`
const TOP = `
0.40 /mo,dollar,/docs/hosting
Prerendering,loop-circular,/docs/prerender
Deployments,cloud-upload,/docs/deploy
`
const path = location.pathname
const NAV = ({ label, cls, icon, route, exact }) => `<li class="nav-item${cls}">
    <a href="${route}" class="nav-link${(exact ? path==route : path.startsWith(route)) ? ' active' : ''}">
        <span class="oi oi-${icon}" aria-hidden="true"></span> ${label}
    </a></li>`
// pass the item's index through to the factory so the first item can be styled
const renderNav = (csv,f) => csv.trim().split(/\r?\n/g)
    .map((s,i) => NAV(f.apply(null,[...s.split(','),i]))).join('')
const $1 = s => document.querySelector(s)

$1('#app-loading .sidebar .nav').innerHTML = renderNav(SIDEBAR, (label, icon, route, i) => ({
    label,
    cls: ` px-3${i == 0 ? ' pt-3' : ''}`, // extra top padding on the first sidebar item
    icon,
    route: route.replace(/\$$/, ''),
    exact: route.endsWith('$')
}))
$1('#app-loading .main-top-row .nav').innerHTML = renderNav(TOP, (label, icon, route) => ({
    label,
    cls: '',
    icon,
    route: route.replace(/\$$/, ''),
    exact: route.endsWith('$')
}))
```

Which takes care of rendering both the top and sidebar menus as well as highlighting the active menu item for the page being loaded, and because we're rendering our real App navigation with real links, users can navigate to the page they want before our App has loaded.

So you can distinguish a prerendered page from a Blazor-rendered page, we've added a **subtle box shadow** to prerendered content which you'll see initially, before reverting to a flat border when the Blazor App takes over and replaces the entire page:

```html
<style>
#app-loading .content {
    box-shadow: inset 0 4px 4px 0 rgb(0 0 0 / 0.05)
}
</style>
```

With just this, every page now benefits from an instant App chrome giving the perception that our App has loaded instantly, before any C# in our Blazor App is run. E.g. here's what the [Blazor Counter](https://blazor.web-templates.io/counter) page looks like while it's loading:

![](/img/pages/jamstack/blazor-wasm/loading.png)

If you refresh the [/counter](https://blazor.web-templates.io/counter) page a few times you'll see the new loading screen prior to the Counter page being available.

Our App is now in a pretty shippable state with the decent UX of a loading page that looks like it loaded instantly, instead of the "under construction" Loading... page from the default Blazor WASM project template. It's not quite a zero-maintenance solution, but still fairly low maintenance as only the `SIDEBAR` and `TOP` csv lists need updating when adding/removing menu items.
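The `$`-suffix convention for exact-match highlighting is easy to get subtly wrong, so here's a standalone TypeScript sketch of the same idea (a hypothetical re-implementation of the nav rendering above, kept free of DOM access so it can be exercised directly):

```typescript
// Hypothetical standalone version of the CSV-driven nav renderer above.
type NavItem = { label: string, icon: string, route: string, exact: boolean }

const navLink = ({ label, icon, route, exact }: NavItem, path: string) =>
    `<li class="nav-item"><a href="${route}" class="nav-link${
        (exact ? path == route : path.startsWith(route)) ? ' active' : ''}">` +
    `<span class="oi oi-${icon}" aria-hidden="true"></span> ${label}</a></li>`

const renderNav = (csv: string, path: string) => csv.trim().split(/\r?\n/g)
    .map(line => {
        const [label, icon, route] = line.split(',')
        return navLink({
            label, icon,
            route: route.replace(/\$$/, ''), // strip the '$' marker from the href
            exact: route.endsWith('$'),      // '$' suffix => highlight on exact match only
        }, path)
    }).join('')

const SIDEBAR = `
Home,home,/$
Counter,plus,/counter
`
const onCounter = renderNav(SIDEBAR, '/counter')
const onHome = renderNav(SIDEBAR, '/')
```

Without the `$` marker, Home's `/` route would prefix-match every path and be highlighted on every page, which is exactly what the exact-match flag prevents.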
### Improving UX with Prerendering

We'll now turn our focus to the most important page in our App, the [Home Page](https://blazor.web-templates.io), which is the page most people will see when loading the App for the first time.

With the above temp App chrome already in place, a simple generic prerendering solution able to load any prerendered page is to check if any prerendered content exists in the [/prerender](https://github.com/LegacyTemplates/blazor-wasm/tree/gh-pages/prerender) folder for the current path, and if it does, replace the default index.html `Loading...` page with it:

```js
const pagePath = path.endsWith('/')
    ? path.substring(0, path.length - 1) + '/index.html' // strip trailing '/', resolve its index.html
    : path
fetch(`/prerender${pagePath}`)
    .then(r => r.text())
    .then(html => {
        if (html.indexOf('<!DOCTYPE html>') >= 0) return // ignore CDN 404.html
        const pageBody = $1('#app-loading .content')
        if (pageBody)
            pageBody.innerHTML = `<i hidden data-prerender="${path}"></i>` + html
    })
    .catch(() => { /* no prerendered content found for this path */ })
```

We also tag which path the prerendered content is for, and provide a JS function to fetch the prerendered content which we'll need to access later in our App:

```html
<script>
/* Loading */
window.prerenderedPage = function () {
    const el = document.querySelector('#app-loading .content')
    return el && el.innerHTML || ''
}
</script>
```

We now have a solution in place to load prerendered content from the `/prerender` folder, but still need some way of generating it.

The solution is technology independent in that you can use any solution you're most comfortable with (even manually constructing each prerendered page if preferred), although it's less maintenance if you automate it and get your CI to regenerate it when it publishes your App.
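The path mapping is the only non-obvious part: directory-style paths resolve to their `index.html`, everything else maps straight through. A hypothetical helper capturing just that rule (the template inlines this logic in index.html):

```typescript
// Map a browser path to the file looked up under /prerender:
// directory paths resolve to their index.html, file-style paths map through unchanged.
const prerenderPath = (path: string) =>
    path.endsWith('/')
        ? path.substring(0, path.length - 1) + '/index.html' // strip trailing '/'
        : path

// e.g. fetch(`/prerender${prerenderPath(location.pathname)}`)
```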
Whichever tool you choose would also need to be installed in your CI/GitHub Action if that's where it's run, so we've opted for a dependency-free & least invasive solution: utilizing the existing Tests project, which has great IDE tooling support, can easily be run from the command-line and, importantly, is supported by the [bUnit](https://bunit.dev) testing library which we'll be using to render component fragments in isolation.

To distinguish prerendering tasks from our other Tests, we've tagged [PrerenderTasks.cs](https://github.com/LegacyTemplates/blazor-wasm/blob/main/MyApp.Tests/PrerenderTasks.cs) with the `prerender` Test category. The only configuration the tasks require is the location of the `ClientDir` WASM Project defined in [appsettings.json](https://github.com/LegacyTemplates/blazor-wasm/blob/main/MyApp.Tests/appsettings.json), which is set up in the constructor.

The `Render<T>()` method renders the Blazor Page inside a `Bunit.TestContext` which it saves at the location specified by its `@page` directive:

```csharp
[TestFixture, Category("prerender")]
public class PrerenderTasks
{
    Bunit.TestContext Context;
    string ClientDir;
    string WwrootDir => ClientDir.CombineWith("wwwroot");
    string PrerenderDir => WwrootDir.CombineWith("prerender");

    public PrerenderTasks()
    {
        Context = new();
        var config = new ConfigurationBuilder().AddJsonFile("appsettings.json").Build();
        ClientDir = config[nameof(ClientDir)]
            ?? throw new Exception($"{nameof(ClientDir)} not defined in appsettings.json");
        FileSystemVirtualFiles.RecreateDirectory(PrerenderDir);
    }

    void Render<T>(params ComponentParameter[] parameters) where T : IComponent
    {
        WriteLine($"Rendering: {typeof(T).FullName}...");
        var component = Context.RenderComponent<T>(parameters);
        var route = typeof(T).GetCustomAttribute<RouteAttribute>()?.Template;
        if (string.IsNullOrEmpty(route))
            throw new Exception($"Couldn't infer @page for component {typeof(T).Name}");

        var fileName = route.EndsWith("/") ?
            route + "index.html" : $"{route}.html";

        var writeTo = Path.GetFullPath(PrerenderDir.CombineWith(fileName));
        WriteLine($"Written to {writeTo}");
        File.WriteAllText(writeTo, component.Markup);
    }

    [Test]
    public void PrerenderPages()
    {
        Render<Client.Pages.Index>();
        // Add Pages to prerender...
    }
}
```

Being a unit test gives it a number of different ways it can be run: using any of the NUnit test runners, from the GUI integrated in C# IDEs, or via command-line test runners like `dotnet test`, which can be done with:

```bash
$ dotnet test --filter TestCategory=prerender
```

To have CI automatically run it when it creates a production build of our App, we'll add it to our Host `.csproj`:

```xml
<PropertyGroup>
    <TestsDir>$(MSBuildProjectDirectory)/../MyApp.Tests</TestsDir>
</PropertyGroup>

<Target Name="AppTasks" AfterTargets="Build" Condition="$(APP_TASKS) != ''">
    <CallTarget Targets="Prerender" Condition="$(APP_TASKS.Contains('prerender'))" />
</Target>
<Target Name="Prerender">
    <Exec Command="dotnet test --filter TestCategory=prerender --logger:&quot;console;verbosity=detailed&quot;"
          WorkingDirectory="$(TestsDir)" />
</Target>
```

Which allows [GitHub Actions to run it](https://github.com/LegacyTemplates/blazor-wasm/blob/9460ebf57d3e46af1680eb3a2ff5080e59d33a54/.github/workflows/release.yml#L80) when it publishes the App with:

```bash
$ dotnet publish -c Release /p:APP_TASKS=prerender
```

Now when we next commit code, the GitHub CI Action will run the above task to generate our [/prerender/index.html](https://github.com/LegacyTemplates/blazor-wasm/blob/gh-pages/prerender/index.html) page which now loads our [Home Page](https://blazor.web-templates.io) instantly!

[![](/img/pages/jamstack/blazor-wasm/home-prerendered.png)](/)

The only issue now is that the default Blazor template behavior will yank our prerendered page twice: once during loading and again during Authorization.
To stop the unwanted yanking, we've updated the [`<Loading/>`](https://github.com/LegacyTemplates/blazor-wasm/blob/main/MyApp.Client/Shared/Loading.razor) component to instead load the prerendered page content if it's **for the current path**:

```html
@inject IJSRuntime JsRuntime
@inject NavigationManager NavigationManager

@if (!string.IsNullOrEmpty(prerenderedHtml))
{
    @((MarkupString)prerenderedHtml)
}
else
{
    <div class=@CssUtils.ClassNames("spinner-border float-start mt-2 me-2", @class)>
        <span class="sr-only"></span>
    </div>
    <h1 style="font-size:36px">
        Loading...
    </h1>
}

@code {
    [Parameter] public string Message { get; set; } = "Loading...";
    [Parameter] public string @class { get; set; } = "";

    public string prerenderedHtml { get; set; } = "";

    protected override async Task OnInitializedAsync()
    {
        var html = await JsRuntime.InvokeAsync<string>("prerenderedPage") ?? "";
        var currentPath = new Uri(NavigationManager.Uri).AbsolutePath;
        if (html.IndexOf($"data-prerender=\"{currentPath}\"") >= 0)
            prerenderedHtml = html;
    }
}
```

Whilst to prevent yanking by the Authorization component, we'll also include the current page when rendering the alternate layout, with an `Authenticating...` message that will appear under the Login/Logout buttons on the top-right:

```xml
<AuthorizeRouteView RouteData="@routeData" DefaultLayout="@typeof(MainLayout)">
    <Authorizing>
        <p class="text-muted" style="float:right;margin:1rem 1rem 0 0">Authenticating...</p>
        <RouteView RouteData="@routeData" />
    </Authorizing>
</AuthorizeRouteView>
```

This last change brings us to the optimal UX we have now: the home page loads instantly whilst our Blazor App loads in the background, eventually replacing the home page with its identical-looking C# version, except for the **box-shadow under the top navigation** so you can tell when you're looking at the prerendered version instead of the C# Blazor version.
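The guard the `<Loading/>` component applies is the `data-prerender` tag check: only adopt the prerendered HTML when it was generated for the page being loaded. A hypothetical standalone sketch of that check (the component does this inline in C#):

```typescript
// Return the prerendered HTML only if its data-prerender tag matches the
// path being loaded, otherwise fall back to the plain loading state ('').
const prerenderedFor = (html: string, currentPath: string) =>
    html.indexOf(`data-prerender="${currentPath}"`) >= 0 ? html : ''

// The fetch logic tags prerendered content with its path:
const html = '<i hidden data-prerender="/docs/prerender"></i><h1>Docs</h1>'
```

Without this check, navigating client-side to a second page before Blazor finished loading would show the first page's stale prerendered content.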
### Prerendering Markdown Content

The other pages that would greatly benefit from prerendering are the Markdown `/docs/*` pages (like this one) implemented in [Docs.razor](https://github.com/LegacyTemplates/blazor-wasm/blob/main/MyApp.Client/Pages/Docs.razor). However, to enable SEO-friendly content our `fetch(/prerender/*)` solution isn't good enough, as the initial page download needs to contain the prerendered content rather than fetching it in afterwards.

### PrerenderMarkdown Task

To do this, our `PrerenderMarkdown` Task scans all `*.md` pages in the [content](https://github.com/LegacyTemplates/blazor-wasm/tree/main/MyApp.Client/wwwroot/content) directory, uses the same [/MyApp.Client/MarkdownUtils.cs](https://github.com/LegacyTemplates/blazor-wasm/blob/main/MyApp.Client/MarkdownUtils.cs) implementation that [Docs.razor](https://github.com/LegacyTemplates/blazor-wasm/blob/main/MyApp.Client/Pages/Docs.razor) uses to render the markdown, then embeds it into the `index.html` loading page to generate the prerendered page:

```csharp
[Test]
public async Task PrerenderMarkdown()
{
    var srcDir = WwrootDir.CombineWith("content").Replace('\\', '/');
    var dstDir = WwrootDir.CombineWith("docs").Replace('\\', '/');

    var indexPage = PageTemplate.Create(WwrootDir.CombineWith("index.html"));
    if (!Directory.Exists(srcDir))
        throw new Exception($"{Path.GetFullPath(srcDir)} does not exist");
    FileSystemVirtualFiles.RecreateDirectory(dstDir);

    foreach (var file in new DirectoryInfo(srcDir).GetFiles("*.md", SearchOption.AllDirectories))
    {
        WriteLine($"Converting {file.FullName} ...");
        var name = file.Name.WithoutExtension();
        var docRender = await Client.MarkdownUtils.LoadDocumentAsync(name,
            doc => Task.FromResult(File.ReadAllText(file.FullName)));
        if (docRender.Failed)
        {
            WriteLine($"Failed: {docRender.ErrorMessage}");
            continue;
        }
        var dirName = dstDir.IndexOf("wwwroot") >= 0 ?
            dstDir.LastRightPart("wwwroot").Replace('\\', '/')
            : new DirectoryInfo(dstDir).Name;
        var path = dirName.CombineWith(name == "index" ? "" : name);
        var mdBody = @$"
<div class=""prose lg:prose-xl min-vh-100 m-3"" data-prerender=""{path}"">
    <div class=""markdown-body"">
        {docRender.Response!.Preview!}
    </div>
</div>";
        var prerenderedPage = indexPage.Render(mdBody);
        string htmlPath = Path.GetFullPath(Path.Combine(dstDir, $"{name}.html"));
        File.WriteAllText(htmlPath, prerenderedPage);
        WriteLine($"Written to {htmlPath}");
    }
}

public class PageTemplate
{
    string? Header { get; set; }
    string? Footer { get; set; }

    public PageTemplate(string? header, string? footer)
    {
        Header = header;
        Footer = footer;
    }

    public static PageTemplate Create(string indexPath)
    {
        if (!File.Exists(indexPath))
            throw new Exception($"{Path.GetFullPath(indexPath)} does not exist");

        string? header = null;
        string? footer = null;

        var sb = new StringBuilder();
        foreach (var line in File.ReadAllLines(indexPath))
        {
            if (header == null)
            {
                if (line.Contains("<!--PAGE-->"))
                {
                    header = sb.ToString(); // capture up to start page marker
                    sb.Clear();
                }
                else sb.AppendLine(line);
            }
            else
            {
                if (sb.Length == 0)
                {
                    if (line.Contains("<!--/PAGE-->")) // discard up to end page marker
                    {
                        sb.AppendLine();
                        continue;
                    }
                }
                else sb.AppendLine(line);
            }
        }
        footer = sb.ToString();

        if (string.IsNullOrEmpty(header) || string.IsNullOrEmpty(footer))
            throw new Exception($"Parsing {indexPath} failed, missing <!--PAGE-->...<!--/PAGE--> markers");

        return new PageTemplate(header, footer);
    }

    public string Render(string body) => Header + body + Footer;
}
```

The `wwwroot/index.html` is parsed with the `PageTemplate` above, which uses the resulting layout to generate pages within its `<!--PAGE--><!--/PAGE-->` markers. When it's executed by the same MSBuild task run by GitHub Actions, it prerenders all `/wwwroot/content/*.md` pages, which are written to the [/wwwroot/docs/*.html](https://github.com/LegacyTemplates/blazor-wasm/tree/gh-pages/docs) folder.
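The `PageTemplate` logic above reduces to splitting `index.html` on the two markers and re-rendering with any body. A hypothetical TypeScript port of that idea, working on whole strings instead of lines, to make the mechanism explicit:

```typescript
// Hypothetical port of the C# PageTemplate: split a layout on its
// <!--PAGE--> / <!--/PAGE--> markers, discard the placeholder between them,
// then re-render the layout around any generated body.
class PageTemplate {
    constructor(private header: string, private footer: string) {}

    static create(indexHtml: string): PageTemplate {
        const start = indexHtml.indexOf('<!--PAGE-->')
        const end = indexHtml.indexOf('<!--/PAGE-->')
        if (start < 0 || end < 0)
            throw new Error('missing <!--PAGE-->...<!--/PAGE--> markers')
        return new PageTemplate(
            indexHtml.substring(0, start),                      // everything before the start marker
            indexHtml.substring(end + '<!--/PAGE-->'.length))   // everything after the end marker
    }

    render(body: string) { return this.header + body + this.footer }
}

const tpl = PageTemplate.create('<html><!--PAGE-->Loading...<!--/PAGE--></html>')
const page = tpl.render('<h1>Docs</h1>')
```

Reusing the real `index.html` as the layout is what keeps the prerendered pages byte-for-byte consistent with the loading shell the Blazor App later replaces.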
This results in the prerendered markdown docs, e.g. [/docs/prerender](https://github.com/LegacyTemplates/blazor-tailwind/blob/main/MyApp.Client/wwwroot/content/prerender.md), having the **exact same path** as their route in the Blazor App, which, when it exists, CDNs give priority to over the SPA fallback the Blazor App is loaded with.

It shares similar behavior to the home page, where its prerendered content is initially loaded before it's replaced with the C# version once the Blazor App loads. The difference is that it prerenders "complete pages" for better SEO & TTFR.

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/templates/blazor-components.md

---
title: Blazor Tailwind Components
---

ServiceStack.Blazor high-productivity components enable rapid development in Blazor Server and WASM Apps:

<div class="my-8 flex justify-center">
    <lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="iKpQI2233nY" style="background-image: url('https://img.youtube.com/vi/iKpQI2233nY/maxresdefault.jpg')"></lite-youtube>
</div>

<div id="blazor-component-gallery" class="not-prose mt-16 relative bg-white dark:bg-black py-4">
    <div class="mx-auto max-w-md px-4 text-center sm:max-w-3xl sm:px-6 lg:max-w-7xl lg:px-8">
        <p class="mt-2 text-3xl font-extrabold tracking-tight text-gray-900 dark:text-gray-50 sm:text-4xl">Blazor Gallery</p>
        <p class="mx-auto mt-5 max-w-prose text-xl text-gray-500">
            Discover ServiceStack.Blazor Rich UI Components and Integrated Features
        </p>
    </div>
</div>

[![](/img/pages/blazor/gallery-splash.png)](https://blazor-gallery.servicestack.net)

As our components support both hosting models, we're maintaining identical Gallery sites running on both **Blazor Server** and **WASM**:

<div class="not-prose mb-16 mx-auto mt-5 max-w-md sm:flex sm:justify-center md:mt-8">
    <div class="rounded-md shadow">
        <a href="https://blazor-gallery.servicestack.net" class="flex w-full items-center
justify-center rounded-md border border-transparent bg-indigo-600 px-8 py-3 text-base font-medium text-white hover:bg-indigo-700 md:py-4 md:px-10 md:text-lg hover:no-underline"> Blazor Server </a> </div> <div class="mt-3 rounded-md shadow sm:mt-0 sm:ml-3"> <a href="https://blazor-gallery.jamstacks.net" class="flex w-full items-center justify-center rounded-md border border-transparent bg-white px-8 py-3 text-base font-medium text-indigo-600 hover:bg-gray-50 md:py-4 md:px-10 md:text-lg hover:no-underline"> Blazor WASM </a> </div> </div> For a closer look at ServiceStack.Blazor Components in action, download & run them to see how good they'll run in your Environment: <div class="flex flex-col"> <a href="https://github.com/NetCoreApps/BlazorGallery" class="flex text-xl text-gray-800"> <svg class="w-6 h-6 mr-2 align-text-bottom" xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24"><path fill="currentColor" d="M12 .297c-6.63 0-12 5.373-12 12c0 5.303 3.438 9.8 8.205 11.385c.6.113.82-.258.82-.577c0-.285-.01-1.04-.015-2.04c-3.338.724-4.042-1.61-4.042-1.61C4.422 18.07 3.633 17.7 3.633 17.7c-1.087-.744.084-.729.084-.729c1.205.084 1.838 1.236 1.838 1.236c1.07 1.835 2.809 1.305 3.495.998c.108-.776.417-1.305.76-1.605c-2.665-.3-5.466-1.332-5.466-5.93c0-1.31.465-2.38 1.235-3.22c-.135-.303-.54-1.523.105-3.176c0 0 1.005-.322 3.3 1.23c.96-.267 1.98-.399 3-.405c1.02.006 2.04.138 3 .405c2.28-1.552 3.285-1.23 3.285-1.23c.645 1.653.24 2.873.12 3.176c.765.84 1.23 1.91 1.23 3.22c0 4.61-2.805 5.625-5.475 5.92c.42.36.81 1.096.81 2.22c0 1.606-.015 2.896-.015 3.286c0 .315.21.69.825.57C20.565 22.092 24 17.592 24 12.297c0-6.627-5.373-12-12-12"/></svg> <span>NetCoreApps/BlazorGallery</span> </a> <a href="https://github.com/NetCoreApps/BlazorGalleryWasm" class="flex mt-2 text-xl text-gray-800"> <svg class="w-6 h-6 mr-2 align-text-bottom" xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24"><path fill="currentColor" d="M12 .297c-6.63 0-12 5.373-12 
12c0 5.303 3.438 9.8 8.205 11.385c.6.113.82-.258.82-.577c0-.285-.01-1.04-.015-2.04c-3.338.724-4.042-1.61-4.042-1.61C4.422 18.07 3.633 17.7 3.633 17.7c-1.087-.744.084-.729.084-.729c1.205.084 1.838 1.236 1.838 1.236c1.07 1.835 2.809 1.305 3.495.998c.108-.776.417-1.305.76-1.605c-2.665-.3-5.466-1.332-5.466-5.93c0-1.31.465-2.38 1.235-3.22c-.135-.303-.54-1.523.105-3.176c0 0 1.005-.322 3.3 1.23c.96-.267 1.98-.399 3-.405c1.02.006 2.04.138 3 .405c2.28-1.552 3.285-1.23 3.285-1.23c.645 1.653.24 2.873.12 3.176c.765.84 1.23 1.91 1.23 3.22c0 4.61-2.805 5.625-5.475 5.92c.42.36.81 1.096.81 2.22c0 1.606-.015 2.896-.015 3.286c0 .315.21.69.825.57C20.565 22.092 24 17.592 24 12.297c0-6.627-5.373-12-12-12"/></svg> <span>NetCoreApps/BlazorGalleryWasm</span> </a> </div> <div class="not-prose pt-8 my-8 ml-20 flex flex-col items-center"> <div> <svg class="w-40 h-40 text-gray-800 mr-8" xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24"><path fill="currentColor" d="M9.37 5.51A7.35 7.35 0 0 0 9.1 7.5c0 4.08 3.32 7.4 7.4 7.4c.68 0 1.35-.09 1.99-.27A7.014 7.014 0 0 1 12 19c-3.86 0-7-3.14-7-7c0-2.93 1.81-5.45 4.37-6.49zM12 3a9 9 0 1 0 9 9c0-.46-.04-.92-.1-1.36a5.389 5.389 0 0 1-4.4 2.26a5.403 5.403 0 0 1-3.14-9.8c-.44-.06-.9-.1-1.36-.1z"/></svg> </div> <h2 id="darkmode" class="border-none text-4xl sm:text-5xl md:text-6xl tracking-tight font-extrabold"> <span class="text-gray-800 mr-6">Dark Mode</span> </h2> </div> All ServiceStack.Blazor components take advantage of Tailwind DarkMode support to include full support for Dark Mode. ![](/img/pages/blazor/dark-and-light-mode.png) [Tailwind](https://tailwindcss.com) has revolutionized how we style our Web Apps with its [mobile first](https://tailwindcss.com/#mobile-first) design system that's dramatically simplified creating maintainable responsive Web Apps. 
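As a quick illustration of that mobile-first approach (the markup below is illustrative, not from the templates): unprefixed utilities apply at every screen size, while `sm:`/`md:`/`lg:` prefixes layer on overrides at progressively larger breakpoints, e.g:

```html
<!-- Stacks into a single column on mobile, becomes a 3-column grid from the md breakpoint up -->
<div class="grid grid-cols-1 gap-4 md:grid-cols-3">
    <div class="p-4 text-sm lg:text-base">...</div>
</div>
```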
It also excels at adding support for [Dark Mode](https://tailwindcss.com/#dark-mode) with its first-class **dark:** modifier allowing the use of standard tailwind classes to specify what elements should look like when viewed in **Dark Mode**, e.g: <div class="not-prose relative bg-slate-50 rounded-xl overflow-hidden dark:bg-slate-800/25"><div style="background-position:10px 10px" class="absolute inset-0 bg-grid-slate-100 [mask-image:linear-gradient(0deg,#fff,rgba(255,255,255,0.6))] dark:bg-grid-slate-700/25 dark:[mask-image:linear-gradient(0deg,rgba(255,255,255,0.1),rgba(255,255,255,0.5))]"></div><div class="relative rounded-xl overflow-auto"> <div class="grid grid-cols-1 sm:grid-cols-2"> <div class="p-8 pt-7"> <p class="mb-2 text-sm font-medium text-slate-500">Light mode</p> <div class="bg-white rounded-lg px-6 py-8 ring-1 ring-slate-900/5 shadow-xl"> <div> <span class="inline-flex items-center justify-center p-2 bg-indigo-500 rounded-md shadow-lg"> <svg class="h-6 w-6 text-white" xmlns="http://www.w3.org/2000/svg" fill="none" viewBox="0 0 24 24" stroke="currentColor" aria-hidden="true"> <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M11 5H6a2 2 0 00-2 2v11a2 2 0 002 2h11a2 2 0 002-2v-5m-1.414-9.414a2 2 0 112.828 2.828L11.828 15H9v-2.828l8.586-8.586z"></path></svg> </span> </div> <h3 class="mt-5 text-base font-medium text-slate-900 tracking-tight">Writes Upside-Down</h3> <p class="mt-2 text-sm text-slate-500"> The Zero Gravity Pen can be used to write in any orientation, including upside-down. It even works in outer space. 
</p> </div> </div> <div class="bg-slate-900 p-8 pt-7"> <p class="mb-2 text-sm font-medium text-slate-400">Dark mode</p> <div class="bg-slate-800 rounded-lg px-6 py-8 ring-1 ring-slate-900/5 shadow-xl"> <div> <span class="inline-flex items-center justify-center p-2 bg-indigo-500 rounded-md shadow-lg"> <svg class="h-6 w-6 text-white" xmlns="http://www.w3.org/2000/svg" fill="none" viewBox="0 0 24 24" stroke="currentColor" aria-hidden="true"> <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M11 5H6a2 2 0 00-2 2v11a2 2 0 002 2h11a2 2 0 002-2v-5m-1.414-9.414a2 2 0 112.828 2.828L11.828 15H9v-2.828l8.586-8.586z"></path></svg> </span> </div> <h3 class="mt-5 text-base font-medium text-white tracking-tight">Writes Upside-Down</h3> <p class="mt-2 text-sm text-slate-400"> The Zero Gravity Pen can be used to write in any orientation, including upside-down. It even works in outer space. </p> </div> </div> </div> </div><div class="absolute inset-0 pointer-events-none border border-black/5 rounded-xl dark:border-white/5"></div></div> ```html <div class="bg-white dark:bg-slate-800 rounded-lg px-6 py-8 ring-1 ring-slate-900/5 shadow-xl"> <div> <span class="inline-flex items-center justify-center p-2 bg-indigo-500 rounded-md shadow-lg"> <svg class="h-6 w-6 text-white" xmlns="http://www.w3.org/2000/svg" fill="none" viewBox="0 0 24 24" stroke="currentColor" aria-hidden="true"><!-- ... --></svg> </span> </div> <h3 class="text-slate-900 dark:text-white mt-5 text-base font-medium tracking-tight">Writes Upside-Down</h3> <p class="text-slate-500 dark:text-slate-400 mt-2 text-sm"> The Zero Gravity Pen can be used to write in any orientation, including upside-down. It even works in outer space. 
</p> </div> ``` ### View ServiceStack.Blazor in Dark Mode We're happy to announce that Dark Mode support is included in **all ServiceStack.Blazor components** and all Blazor Tailwind project templates, where you can toggle Dark Mode on/off with the `<DarkModeToggle>` component. Check out this video to see how beautiful Dark Mode looks in the latest ServiceStack.Blazor Components and Tailwind project templates: <div class="my-8 flex justify-center"> <lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="8nwpC_B4AC4" style="background-image: url('https://img.youtube.com/vi/8nwpC_B4AC4/maxresdefault.jpg')"></lite-youtube> </div> For a more interactive view, use the Dark Mode toggle on the right to turn Dark Mode on/off in the embedded [Blazor Gallery Contacts Page](https://blazor-gallery.jamstacks.net/grid/contacts-meta): <iframe src="https://blazor-gallery.jamstacks.net/grid/contacts-meta/embed" class="w-full border-none h-[970px]"></iframe> Dark Mode is implemented entirely with CSS, controlled by toggling the **dark** class on the `<html class="dark">` element. `<DarkModeToggle>` also saves this user preference in `localStorage` where it's preserved across browser restarts.
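As a sketch of typical usage (the placement and surrounding markup here are illustrative), the toggle can be dropped anywhere in your App's markup, e.g. in a layout's header:

```html
<!-- e.g. in MainLayout.razor — lets users switch between Light and Dark Mode -->
<div class="flex justify-end p-2">
    <DarkModeToggle />
</div>
```

Since Dark Mode is driven by the `dark` class on the `<html>` element, this approach depends on Tailwind's class-based strategy (`darkMode: 'class'` in **tailwind.config.js**) rather than the OS-preference **media** strategy.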
### View in Dark Mode The Blazor Tailwind templates also include the ability to override the users color scheme preference and open a page in dark or light mode with the `?dark` and `?light` query params: <div class="not-prose relative"> <div class="grid grid-cols-1 sm:grid-cols-2 text-center"> <div> <a class="flex flex-col" href="https://blazor.web-templates.io/?light"> <div class="py-4">blazor.web-templates.io/?light</div> <img class="rounded-xl overflow-hidden" src="/img/pages/blazor/blazor-gallery-light.png"> </a> </div> <div> <a class="flex flex-col" href="https://blazor.web-templates.io/?dark"> <div class="py-4">blazor.web-templates.io/?dark</div> <img class="rounded-xl overflow-hidden" src="/img/pages/blazor/blazor-gallery-dark.png"> </a> </div> </div> </div> ### Force Dark Mode If your App is best viewed in Dark Mode you can force it to use Dark Mode with `JS.init()` when initializing ServiceStack.Blazor's JS library in Blazor Server's **_Layout.cshtml** or Blazor WASM's **index.html**, e.g: ```html <script src="_framework/blazor.server.js"></script> <script src="/js/servicestack-blazor.js"></script> <script>JS.init({ colorScheme:'dark' })</script> ``` <p class="hide-h2"></p> ## Blazor Tailwind Components <div id="blazor-components" class="not-prose mt-16 mb-8 ml-20 flex flex-col items-center"> <div class="flex"> <svg class="w-40 h-40 text-purple-600 mr-8" xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24"><path fill="currentColor" d="M23.834 8.101a13.912 13.912 0 0 1-13.643 11.72a10.105 10.105 0 0 1-1.994-.12a6.111 6.111 0 0 1-5.082-5.761a5.934 5.934 0 0 1 11.867-.084c.025.983-.401 1.846-1.277 1.871c-.936 0-1.374-.668-1.374-1.567v-2.5a1.531 1.531 0 0 0-1.52-1.533H8.715a3.648 3.648 0 1 0 2.695 6.08l.073-.11l.074.121a2.58 2.58 0 0 0 2.2 1.048a2.909 2.909 0 0 0 2.695-3.04a7.912 7.912 0 0 0-.217-1.933a7.404 7.404 0 0 0-14.64 1.603a7.497 7.497 0 0 0 7.308 7.405s.549.05 1.167.035a15.803 15.803 0 0 0 
8.475-2.528c.036-.025.072.025.048.061a12.44 12.44 0 0 1-9.69 3.963a8.744 8.744 0 0 1-8.9-8.972a9.049 9.049 0 0 1 3.635-7.247a8.863 8.863 0 0 1 5.229-1.726h2.813a7.915 7.915 0 0 0 5.839-2.578a.11.11 0 0 1 .059-.034a.112.112 0 0 1 .12.053a.113.113 0 0 1 .015.067a7.934 7.934 0 0 1-1.227 3.549a.107.107 0 0 0-.014.06a.11.11 0 0 0 .073.095a.109.109 0 0 0 .062.004a8.505 8.505 0 0 0 5.913-4.876a.155.155 0 0 1 .055-.053a.15.15 0 0 1 .147 0a.153.153 0 0 1 .054.053A10.779 10.779 0 0 1 23.834 8.1zM8.895 11.628a2.188 2.188 0 1 0 2.188 2.188v-2.042a.158.158 0 0 0-.15-.15Z"/></svg> <svg class="w-44 h-44" xmlns="http://www.w3.org/2000/svg" width="256" height="154" viewBox="0 0 256 154"><defs><linearGradient id="logosTailwindcssIcon0" x1="-2.778%" x2="100%" y1="32%" y2="67.556%"><stop offset="0%" stop-color="#2298BD"/><stop offset="100%" stop-color="#0ED7B5"/></linearGradient></defs><path fill="url(#logosTailwindcssIcon0)" d="M128 0C93.867 0 72.533 17.067 64 51.2C76.8 34.133 91.733 27.733 108.8 32c9.737 2.434 16.697 9.499 24.401 17.318C145.751 62.057 160.275 76.8 192 76.8c34.133 0 55.467-17.067 64-51.2c-12.8 17.067-27.733 23.467-44.8 19.2c-9.737-2.434-16.697-9.499-24.401-17.318C174.249 14.743 159.725 0 128 0ZM64 76.8C29.867 76.8 8.533 93.867 0 128c12.8-17.067 27.733-23.467 44.8-19.2c9.737 2.434 16.697 9.499 24.401 17.318C81.751 138.857 96.275 153.6 128 153.6c34.133 0 55.467-17.067 64-51.2c-12.8 17.067-27.733 23.467-44.8 19.2c-9.737-2.434-16.697-9.499-24.401-17.318C110.249 91.543 95.725 76.8 64 76.8Z"/></svg> </div> <h2 class="border-none text-4xl sm:text-5xl md:text-6xl tracking-tight font-extrabold"> <span class="text-purple-600 mr-6">Blazor</span> <span class="mr-6" style="color:#44A8B3">Tailwind</span> <span class="text-gray-800">Components</span> </h2> </div> We encourage you to explore to the Blazor Gallery websites for the full preview, but we'll look at some of the Components here to give you some idea of the functionality available. 
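Before diving in, a minimal setup sketch — the exact registration varies by project template, so treat the lines below as indicative and consult your template's **_Imports.razor** and **Program.cs**. The components ship in the **ServiceStack.Blazor** NuGet package under the `ServiceStack.Blazor.Components.Tailwind` namespace and use a registered `JsonApiClient` to call your APIs:

```csharp
// _Imports.razor — make the Tailwind components available to all pages
@using ServiceStack.Blazor
@using ServiceStack.Blazor.Components.Tailwind

// Program.cs (Blazor WASM) — register the typed JsonApiClient the components use
builder.Services.AddBlazorApiClient(builder.HostEnvironment.BaseAddress);
```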
### DataGrid `DataGrid` is a versatile Component we expect to be heavily used for rendering any typed collection: ```html <DataGrid Model="Track" Items=@Track.Results /> ``` By default it renders results in a striped Tailwind Table: ![](/img/pages/blazor/datagrid-tracks.png) Its appearance can be styled to support many of the [Tailwind Table Styles](https://tailwindui.com/components/application-ui/lists/tables) with the `TableStyle` flags enum, e.g: ```html <DataGrid Model="Track" Items=@Track.Results TableStyle="TableStyle.VerticalLines" /> <DataGrid Model="Track" Items=@Track.Results TableStyle="TableStyle.WhiteBackground" /> <DataGrid Model="Track" Items=@Track.Results TableStyle="TableStyle.FullWidth" /> <DataGrid Model="Track" Items=@Track.Results TableStyle="TableStyle.UppercaseHeadings | TableStyle.FullWidth | TableStyle.VerticalLines" /> ``` ![](/img/pages/blazor/datagrid-table-styles.png) It's a highly versatile component where you can control which columns are displayed and how they're formatted using `<Column>` definitions, e.g. here's how to customize the table to look like Blazor's [FetchData.cshtml](https://github.com/SteveSanderson/Blazor/blob/master/samples/ClientServerApp/ClientServerApp.Client/FetchData.cshtml) tabular results: ```html <DataGrid Items=@forecasts class="max-w-screen-md" TableStyle="TableStyle.StripedRows | TableStyle.UppercaseHeadings"> <Column Field="(WeatherForecast x) => x.Date" Format="dd/MM/yyyy" /> <Column Title="Temp. (C)" Field="(WeatherForecast x) => x.TemperatureC" /> <Column Title="Temp. (F)" Field="(WeatherForecast x) => x.TemperatureF" /> <Column Field="(WeatherForecast x) => x.Summary" /> </DataGrid> @code { List<WeatherForecast> forecasts = new(); protected override async Task OnInitializedAsync() { forecasts = await Http.GetFromJsonAsync<List<WeatherForecast>>("data/weather.json") ??
new(); } } ``` [![](/img/pages/blazor/datagrid-weather.png)](https://blazor-gallery.servicestack.net/fetchdata) Here's a more advanced example showing how to implement a responsive DataGrid, utilizing custom Header and Table Cell templates to define which columns and Headers are visible at different responsive breakpoints, how to enable features like **Row Selection** and **Filtering**, and how to handle the Row and Header selected events Apps can use to execute custom logic: ```html <DataGrid Model="Booking" Items=@Items AllowSelection="true" AllowFiltering="true" HeaderSelected="HandleSelectedHeader" RowSelected="HandleSelectedRow"> <Column Field="(Booking x) => x.Id" class="text-gray-900" /> <Column Field="(Booking x) => x.Name" VisibleFrom="Breakpoint.ExtraLarge" /> <Column Field="(Booking x) => x.RoomType"> <Header> <span class="hidden lg:inline">Room </span>Type </Header> </Column> <Column Field="(Booking x) => x.RoomNumber"> <Header> <span class="hidden lg:inline">Room </span>No </Header> </Column> <Column Field="(Booking x) => x.Cost" Format="C" /> <Column Field="(Booking x) => x.BookingStartDate" Formatter="FormatDate" VisibleFrom="Breakpoint.Small"> <Header> Start<span class="hidden lg:inline"> Date</span> </Header> </Column> <Column Field="(Booking x) => x.BookingEndDate" Formatter="FormatDate" VisibleFrom="Breakpoint.ExtraLarge"> <Header> End<span class="hidden lg:inline"> Date</span> </Header> <Template>@{ var booking = context as Booking; }@booking.BookingEndDate?.ToString("D") </Template> </Column> <Column Title="Employee" Field="(Booking x) => x.CreatedBy" VisibleFrom="Breakpoint.Medium" /> </DataGrid> @code { public List<Booking> Items { get; set; } = new() { Create.Booking("First Booking!", RoomType.Queen, 10, 100, "employee@email.com", "BOOK10"), Create.Booking("Booking 2", RoomType.Double, 12, 120, "manager@email.com", "BOOK25"), Create.Booking("Booking the 3rd", RoomType.Suite, 13, 130, "employee@email.com", "BOOK50"),
}; string FormatDate(object o) => o is DateTime d ? d.ToShortDateString() : ""; public async Task HandleSelectedHeader(Column<Booking> item) { await JS.Log(item.Name); } public async Task HandleSelectedRow(Booking x) { await JS.Log(x); } } ``` :::tip Resize the webpage to preview its responsive appearance at different resolution breakpoints ::: <iframe class="w-full border-none" style="height:385px" src="https://blazor-gallery.jamstacks.net/gallery/datagrid/responsive"></iframe> ### AutoQueryGrid The functionality and extensibility in `DataGrid` lays the foundation for higher-level components like `AutoQueryGrid` which makes use of it to enable its Auto UI around [AutoQuery CRUD](/autoquery/crud) Services. ### AutoQueryGrid Read Only At a minimum AutoQueryGrid requires the AutoQuery APIs it should call to implement its functionality, so you can implement a **read-only** grid by only specifying the AutoQuery API to query a data model, e.g: ```csharp <AutoQueryGrid Model="Booking" Apis="Apis.AutoQuery<QueryBookings>()" /> ``` This one AutoQuery API is enough to power a functional read-only UI enabling flexible querying capabilities, paging, custom column selection, exporting the filtered resultset to .csv (which can be opened in Excel) and copying the API URL Apps can use to consume the JSON API results: <iframe class="w-full border-none" style="height:430px" src="https://blazor-gallery.jamstacks.net/gallery/autoquerygrid/readonly"></iframe> ### AutoQueryGrid CRUD Full CRUD functionality can be enabled by specifying the AutoQuery CRUD APIs for a data model, e.g: ```csharp <AutoQueryGrid Model="Booking" Apis="Apis.AutoQuery<QueryBookings,CreateBooking,UpdateBooking,DeleteBooking>()" /> ``` [![](/img/pages/blazor/autoquerygrid-crud.png)](https://blazor-gallery.jamstacks.net/gallery/autoquerygrid/crud) ### Customizable Columns As `AutoQueryGrid` builds on `DataGrid` it also inherits its customizable options allowing for [customizable
responsive columns](https://blazor-gallery.servicestack.net/gallery/autoquerygrid), e.g: ```html <AutoQueryGrid Model="Booking" Apis="Apis.AutoQuery<QueryBookings,CreateBooking,UpdateBooking,DeleteBooking>()" AllowSelection="true" AllowFiltering="true" HeaderSelected="OnSelectedHeader" RowSelected="OnSelectedRow"> <Columns> <!-- Custom class --> <Column Field="(Booking x) => x.Id" class="text-gray-900" /> <!-- Only show from Tailwind's xl responsive Breakpoint --> <Column Field="(Booking x) => x.Name" VisibleFrom="Breakpoint.ExtraLarge" /> <!-- Custom Header collapsing 'Room' below 'lg' responsive breakpoint --> <Column Field="(Booking x) => x.RoomType"> <Header> <span class="hidden lg:inline">Room </span>Type </Header> </Column> <!-- Custom Header collapsing 'Room' below 'lg' responsive breakpoint --> <Column Field="(Booking x) => x.RoomNumber"> <Header> <span class="hidden lg:inline">Room </span>No </Header> </Column> <!-- Custom string Format --> <Column Field="(Booking x) => x.Cost" Format="C" /> <!-- Custom C# Formatter --> <Column Field="(Booking x) => x.BookingStartDate" Formatter="FormatDate" VisibleFrom="Breakpoint.Small"> <Header> Start<span class="hidden lg:inline"> Date</span> </Header> </Column> <!-- Custom Header and Cell Value --> <Column Field="(Booking x) => x.BookingEndDate" VisibleFrom="Breakpoint.ExtraLarge2x"> <Header> End<span class="hidden lg:inline"> Date</span> </Header> <Template> @context.BookingEndDate?.ToString("D") </Template> </Column> <!-- Custom Title and Complex Type Cell with Reference Link --> <Column Title="Voucher" Field="(Booking x) => x.Discount" VisibleFrom="Breakpoint.ExtraLarge"> <Template> @if (context.Discount != null) { <TextLink class="flex items-end" href=@($"/gallery/autoquerygrid/coupons?Id={context.Discount.Id}")> <PreviewFormat Value=@context.Discount /> </TextLink> } </Template> </Column> </Columns> </AutoQueryGrid> ``` Customizing how and when columns are rendered at different breakpoints using different 
formatting options and custom table header and cell templates: [![](/img/pages/blazor/autoquerygrid-responsive.png)](https://blazor-gallery.jamstacks.net/gallery/autoquerygrid/responsive) ### Declarative Customizations The columns can also be customized declaratively using the `[Format]` Metadata Attribute on the Model type: ```csharp public class Contact : AuditBase { [AutoIncrement] public int Id { get; set; } [Format(FormatMethods.IconRounded)] public string ProfileUrl { get; set; } public string FirstName { get; set; } public string LastName { get; set; } [Format(FormatMethods.Currency)] public int? SalaryExpectation { get; set; } [Format(FormatMethods.LinkEmail, Options = @"{target:'_self',subject:'New Job Opportunity', body:'We have an exciting new opportunity...', cls:'text-green-600'}")] public string Email { get; set; } [Format(FormatMethods.LinkPhone)] public string Phone { get; set; } } ``` Which can change how results are formatted in the data grid results: [![](/img/pages/blazor/autoquerygrid-metadata.png)](https://blazor-gallery.jamstacks.net/grid/contacts-meta) Whilst the `[Input]` and `[FieldCss]` attributes on the AutoQuery CRUD DTOs: ```csharp public class UpdateContact : IPatchDb<Contact>, IReturn<Contact> { public int Id { get; set; } [ValidateNotEmpty] public string? FirstName { get; set; } [ValidateNotEmpty] public string? LastName { get; set; } [Input(Type = "file"), UploadTo("profiles")] public string? ProfileUrl { get; set; } public int? SalaryExpectation { get; set; } [ValidateNotEmpty] public string? JobType { get; set; } public int? AvailabilityWeeks { get; set; } public EmploymentType? PreferredWorkType { get; set; } public string? PreferredLocation { get; set; } [ValidateNotEmpty] public string? Email { get; set; } public string? Phone { get; set; } [Input(Type="textarea")] [FieldCss(Field="col-span-12 text-center", Input="h-48", Label="text-xl text-indigo-700")] public string? 
About { get; set; } } ``` Can customize how forms are rendered, e.g: [![](/img/pages/blazor/autoquerygrid-metadata-form.png)](https://blazor-gallery.jamstacks.net/grid/contacts-meta) ### ToolbarButtons The `<ToolbarButtons>` element can be used to add your own custom Toolbar buttons to the AutoQueryGrid, e.g: ```html <AutoQueryGrid> <ToolbarButtons> <div class="pl-2"><button>1</button></div> <div class="pl-2"><button>2</button></div> </ToolbarButtons> </AutoQueryGrid> ``` This enables complete control over the Toolbar, as all existing toolbar buttons can be removed with [AutoQueryGrid parameters](https://reference.servicestack.net/api/ServiceStack.Blazor.Components.Tailwind/AutoQueryGrid%60Model%60). ### Custom Edit and Create Forms The `<CreateForm>` and `<EditForm>` elements can be used to replace the default [Auto Forms](https://blazor-gallery.jamstacks.net/gallery/autoform) used in creating and editing rows when more advanced or customized functionality is needed. With this feature we can create a Custom AutoQueryGrid component that uses Custom Edit & Create Forms when selecting and adding records, and also customize the displayed Grid results with the `ConfigureQuery` parameter to ensure results are filtered to the selected Tenant's records: ```html <AutoQueryGrid @ref=@grid Model="Item" Apis="Apis.AutoQuery<QueryItems,NewItem,EditItem>()" ConfigureQuery="Configure"> <CreateForm> <div class="relative z-10"> <div class="pointer-events-none fixed inset-y-0 right-0 flex max-w-full pl-10 sm:pl-16"> <CustomCreateItem OnClose="grid!.OnEditDone" /> </div> </div> </CreateForm> <EditForm> <div class="relative z-10"> <div class="pointer-events-none fixed inset-y-0 right-0 flex max-w-full pl-10 sm:pl-16"> <CustomEditItem Item="context" OnClose="grid!.OnEditDone" /> </div> </div> </EditForm> </AutoQueryGrid> @code { AutoQueryGrid<Item>? grid; [Parameter, SupplyParameterFromQuery] public int?
TenantId { get; set; } void Configure(QueryBase query) { query.AddQueryParam("TenantId", TenantId); } } ``` ### Managing Filters & Preferences By default the AutoQueryGrid displays the user's selected columns and query limit preferences which are persisted in localStorage. They can be overridden with the **Prefs** attribute which provides different ergonomic methods for configuration within an attribute. To set the Query Results Limit: ```html <AutoQueryGrid @ref=@grid Model="Contact" Apis="Apis.AutoQuery<QueryContacts,CreateContact,UpdateContact>()" Prefs="ApiPrefs.Create(take:10)" /> ``` To limit which columns are displayed in the Query Results: ```html <AutoQueryGrid @ref=@grid Model="Contact" Apis="Apis.AutoQuery<QueryContacts,CreateContact,UpdateContact>()" Prefs="ApiPrefs.Columns(nameof(Contact.Id), nameof(Contact.LastName))" /> ``` ```html <AutoQueryGrid @ref=@grid Model="Contact" Apis="Apis.AutoQuery<QueryContacts,CreateContact,UpdateContact>()" Prefs="ApiPrefs.Columns<Contact>(x => new { x.Id, x.LastName, x.Email })" /> ``` To configure both the Query Limit and Selected Columns: ```html <AutoQueryGrid @ref=@grid Model="Contact" Apis="Apis.AutoQuery<QueryContacts,CreateContact,UpdateContact>()" Prefs="ApiPrefs.Create(take:10, columns:new(){ nameof(Contact.Id), nameof(Contact.LastName) })" /> ``` ```html <AutoQueryGrid @ref=@grid Model="Contact" Apis="Apis.AutoQuery<QueryContacts,CreateContact,UpdateContact>()" Prefs="ApiPrefs.Configure(x => { x.Take = 5; x.SelectedColumns=new() { nameof(Contact.LastName) }; })"/> ``` In addition, the methods below can be used to clear any user-defined query filters and column preferences: | Method | Description | |--|--| | grid.ClearFiltersAsync() | Remove user-defined Filters | | grid.ResetPreferencesAsync() | Remove user-defined Filters and Column Preferences | ### Disable Column Filtering By default Filtering and Sorting are disabled for complex type columns; they can also be explicitly disabled per column with
`AllowFiltering`, e.g: ```html <AutoQueryGrid> <Column Field="(Contact x) => x.Phone" AllowFiltering="false" /> </AutoQueryGrid> ``` ### Changing AutoQueryGrid Defaults A lot of AutoQueryGrid's UI is customizable, allowing you to easily toggle UI features on/off as needed. If you have a consistent style you wish to enforce you can change the defaults of every AutoQueryGrid component with [BlazorConfig](#blazor-config), e.g. you can remove the **Copy URL** button and change the default Table style to use Uppercase Headings with: ```csharp BlazorConfig.Set(new() { //... AutoQueryGridDefaults = new() { TableStyle = TableStyle.StripedRows | TableStyle.UppercaseHeadings, ShowCopyApiUrl = false, } }); ``` This will change the appearance of every `AutoQueryGrid` Component used in the App unless overridden. ### AutoQueryGrid Gallery As AutoQueryGrid is a core component for the rapid development of Apps we're maintaining a dedicated section showcasing its different features at [blazor-gallery.servicestack.net/grid](https://blazor-gallery.servicestack.net/grid): [![](/img/pages/blazor/autoquerygrid-more.png)](https://blazor-gallery.servicestack.net/grid) ## Modal Lookups To provide an optimal UX for relational fields `AutoQueryGrid` utilizes Modal Lookups for searching and selecting referential data that's automatically inferred from your OrmLite data model relationships, e.g: ```csharp public class JobApplication : AuditBase { [AutoIncrement] public int Id { get; set; } [References(typeof(Job))] public int JobId { get; set; } [References(typeof(Contact))] public int ContactId { get; set; } //...
} ``` Where it will display an enhanced [LookupInput](https://reference.servicestack.net/api/ServiceStack.Blazor.Components.Tailwind/LookupInput) instead of a plain Text Input for the relational `JobId` and `ContactId` fields: [![](/img/pages/blazor/modalLookup-input.png)](https://blazor-gallery.servicestack.net/grid/job-applications?edit=1) Users can use this to quickly search for the related record instead of manually inserting Foreign Key Ids: [![](/img/pages/blazor/modalLookup-dialog.png)](https://blazor-gallery.servicestack.net/grid/job-applications?edit=1) ## File Uploads Another feature showcased in the above screenshots is support for [Managed File Uploads](/locode/files-overview), which can be declaratively added with the `[Input(Type="file")]` attribute to render the [FileInput](https://reference.servicestack.net/api/ServiceStack.Blazor.Components.Tailwind/FileInput) Component and the `[UploadTo]` attribute to specify which File [Upload location](/locode/files#basic-file-upload-example) it should use: ```csharp public class UpdateJobApplication : IPatchDb<JobApplication>, IReturn<JobApplication> { public int Id { get; set; } public int? JobId { get; set; } public int? ContactId { get; set; } public DateTime? AppliedDate { get; set; } public JobApplicationStatus? ApplicationStatus { get; set; } [Input(Type = "file"), UploadTo("applications")] public List<JobApplicationAttachment>?
Attachments { get; set; } } ``` For a quick primer on using Managed File Uploads to [Upload files from Blazor](/locode/files-blazor) check out: <div class="my-16 px-4 sm:px-6"> <div class="text-center"> <h1 class="text-4xl tracking-tight font-extrabold text-gray-900 sm:text-5xl"> <a href="/locode/files-blazor" class="block xl:inline">File Blazor</a> </h1> </div> <div class="my-8"> <lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="5sd00MzHpaU" style="background-image: url('https://img.youtube.com/vi/5sd00MzHpaU/maxresdefault.jpg')"></lite-youtube> </div> </div> ## Auto Forms The Auto Form components are other high-productivity components which can be used to create an automated form from a Request DTO definition: ```csharp <AutoCreateForm Model="Booking" ApiType="typeof(CreateBooking)" /> ``` [![](/img/pages/blazor/AutoCreateForm.png)](https://blazor-gallery.servicestack.net/gallery/autoform) ### AutoEditForm `AutoEditForm` can be used to render an automated form to update and delete records via an [AutoQuery CRUD](/autoquery/crud) API: ```csharp <AutoEditForm Model="Booking" Edit="Model" ApiType="typeof(UpdateBooking)" DeleteApiType="typeof(DeleteBooking)" /> @code { Booking Model = Create.Booking("First Booking!", RoomType.Queen, 10, 100, "employee@email.com"); } ``` [![](/img/pages/blazor/AutoEditForm.png)](https://blazor-gallery.servicestack.net/gallery/autoform) The form's behavior and appearance are further customizable with the [API annotation](/locode/declarative#annotate-apis), declarative [validation](/locode/declarative#type-validation-attributes) and the custom [Field and Input](/locode/declarative#custom-fields-and-inputs) attributes, e.g: ```csharp [Description("Update an existing Booking")] [Notes("Find out how to quickly create a <a class='svg-external' target='_blank' href='https://youtu.be/rSFiikDjGos'>C# Bookings App from Scratch</a>")] [Route("/booking/{Id}", "PATCH")] [ValidateHasRole("Employee")]
[AutoApply(Behavior.AuditModify)] public class UpdateBooking : IPatchDb<Booking>, IReturn<IdResponse> { public int Id { get; set; } public string? Name { get; set; } public RoomType? RoomType { get; set; } [ValidateGreaterThan(0)] public int? RoomNumber { get; set; } [ValidateGreaterThan(0)] public decimal? Cost { get; set; } public DateTime? BookingStartDate { get; set; } public DateTime? BookingEndDate { get; set; } [Input(Type = "textarea")] public string? Notes { get; set; } public bool? Cancelled { get; set; } } ``` Both AutoForm components will render the Form UI in a Slide Over dialog and include built-in support for calling the API to update or edit the record with integrated contextual validation, reporting any field validation errors alongside their Input controls. ### AutoFormFields If more advanced customization of a Form's appearance and behavior is required, you can use `AutoFormFields` to render just the Form's fields (including Validation binding), which can be used to populate a Request DTO that your App handles sending itself, e.g: [![](/img/pages/blazor/AutoFormFields.png)](https://blazor-gallery.servicestack.net/gallery/autoform) ```csharp <form @onsubmit="submit" @onsubmit:preventDefault> <div class="shadow sm:overflow-hidden sm:rounded-md max-w-screen-lg"> <div class="space-y-6 bg-white py-6 px-4 sm:p-6"> <div> <h3 class="text-lg font-medium leading-6 text-gray-900">@(ApiType.GetDescription())</h3> <p class="notes mt-1 text-sm text-gray-500"> @((MarkupString)ApiType.GetNotes()) </p> </div> <AutoFormFields Type="typeof(Booking)" Api="Api" FormLayout="FormLayout" ModelDictionary="ModelDictionary"/> </div> <div class="bg-gray-50 px-4 py-3 text-right sm:px-12"> <PrimaryButton type="submit" onclick="submit">Save</PrimaryButton> </div> </div> </form> @code { [Inject] public JsonApiClient? Client { get; set; } IHasErrorStatus? Api { get; set; } Type ApiType = typeof(UpdateBooking); List<InputInfo>?
FormLayout { get; set; } Dictionary<string, object> ModelDictionary { get; set; } = new(); MetadataType MetadataType => ApiType.ToMetadataType(); Booking Edit = Create.Booking("First Booking!", RoomType.Queen, 10, 100, "employee@email.com"); protected override async Task OnParametersSetAsync() { await base.OnParametersSetAsync(); Api = null; ModelDictionary = Edit.ToModelDictionary(); FormLayout ??= MetadataType.CreateFormLayout<Booking>(); } async Task submit() { var request = ModelDictionary.FromModelDictionary<UpdateBooking>(); Api = await Client!.ApiAsync(request); } } ``` ## Autocomplete The `<Autocomplete>` component provides a user-friendly Input to search and quickly select items, with support for rich templated content, custom matching and infinite scrolling that avoids pre-loading the entire bound list of items. Instead of being populated with a fixed List of strings or Key Value Pairs, the Autocomplete component can bind directly to a list of POCOs to render its templated content, where you can specify a custom `Match` filter to control which items are displayed, including fuzzy matching on one or multiple POCO properties.
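The examples below bind to fields and an **allContacts** source that aren't shown; a hypothetical backing model sufficient for them could look like the following (only `DisplayName` and `ProfileUrl` are referenced by the item templates, everything else is illustrative):

```csharp
public class Contact
{
    public string DisplayName { get; set; } = string.Empty;
    public string ProfileUrl { get; set; } = string.Empty;
}

// fields in the page's @code block backing the examples
List<Contact> allContacts = new(); // e.g. loaded from an API in OnInitializedAsync
Contact? simple;                   // single selection bound with @bind-Value
Contact? contact;
List<Contact> contacts = new();    // multiple selections bound with @bind-Values
```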
### Single Contact Here's a simple Autocomplete example that binds to a **simple** `Contact` from a `List<Contact>` in **allContacts**: ```html <Autocomplete T="Contact" Options="allContacts" @bind-Value="simple" Label="Single Contact" Match="(x, value) => x!.DisplayName.Contains(value, StringComparison.OrdinalIgnoreCase)" placeholder="Select Contact"> <Item> <span class="block truncate">@context!.DisplayName</span> </Item> </Autocomplete> ``` ### Single Contact with Icon The item content is templated allowing for rich content which can be used to display a Contact's profile picture and name: ```html <Autocomplete T="Contact" Options="allContacts" @bind-Value="contact" Label="Single Contact with Icon" Match="(x, value) => x!.DisplayName.Contains(value, StringComparison.OrdinalIgnoreCase)" placeholder="Select Contact"> <Item> <div class="flex items-center"> <Icon class="h-6 w-6 flex-shrink-0 rounded-full" Src=@context.ProfileUrl /> <span class="ml-3 truncate">@context!.DisplayName</span> </div> </Item> </Autocomplete> ``` ### Multiple Contacts with Icon It also supports multiple selection by using `@bind-Values` to bind to the `List<Contact>` in **contacts** instead, e.g: ```html <Autocomplete Options="allContacts" @bind-Values="contacts" Label="Multiple Contacts with Icon" Match="(x, value) => x!.DisplayName.Contains(value, StringComparison.OrdinalIgnoreCase)" placeholder="Select Contacts"> <Item> <div class="flex items-center"> <Icon class="h-6 w-6 flex-shrink-0 rounded-full" Src=@context.ProfileUrl /> <span class="ml-3 truncate">@context!.DisplayName</span> </div> </Item> </Autocomplete> ``` and here's a working example of what they look like together in the same form ([example source code](https://blazor-gallery.jamstacks.net/gallery/inputs)): <div class="pt-12 flex justify-center"> <iframe src="https://blazor-gallery.jamstacks.net/gallery/inputs/autocomplete" class="w-[500px] border-none h-[560px]"></iframe> </div> ## TagInput The TagInput component is useful 
for when you want to manage a list of strings like words or tags - an input that's notably lacking in HTML Forms.

Best of all, `<TagInput>` functions like any other input, so it can be included and customized in declarative forms. For example, this Update AutoQuery Request DTO:

```csharp
// Customize Edit Forms with [Input] and [FieldCss] attributes
public class UpdateContact : IPatchDb<Contact>, IReturn<Contact>
{
    public int Id { get; set; }

    [ValidateNotEmpty]
    public string? FirstName { get; set; }

    [ValidateNotEmpty]
    public string? LastName { get; set; }

    [Input(Type = "file"), UploadTo("profiles")]
    public string? ProfileUrl { get; set; }

    public int? SalaryExpectation { get; set; }

    [ValidateNotEmpty]
    public string? JobType { get; set; }

    public int? AvailabilityWeeks { get; set; }
    public EmploymentType? PreferredWorkType { get; set; }
    public string? PreferredLocation { get; set; }

    [ValidateNotEmpty]
    public string? Email { get; set; }
    public string? Phone { get; set; }

    [Input(Type = "tag"), FieldCss(Field = "col-span-12")]
    public List<string>? Skills { get; set; }

    [Input(Type="textarea")]
    [FieldCss(Field="col-span-12 text-center", Input="h-48", Label="text-xl text-indigo-700")]
    public string?
About { get; set; }
}
```

is all that's needed to render an instantly working, API-enabled Form with validation binding using [Auto Form components](https://blazor-gallery.jamstacks.net/gallery/autoform):

```html
<AutoEditForm class=@Class Model="Contact" ApiType="typeof(UpdateContact)" Edit=@contact />
```

By default the form renders in a SlideOver dialog, as seen when editing a row in the [Contacts AutoQueryGrid](https://blazor-gallery.jamstacks.net/grid/contacts-meta?edit=1) component:

<iframe src="https://blazor-gallery.jamstacks.net/grid/contacts-meta/embed?edit=1" class="mt-8 w-full border-none h-[970px]"></iframe>

Alternatively it can be rendered in a traditional **"card"** form layout with the new `FormStyle.Card` option:

```html
<AutoEditForm class=@Class FormStyle="FormStyle.Card" Model="Contact" ApiType="typeof(UpdateContact)" Edit=@contact />
```

<iframe src="https://blazor-gallery.jamstacks.net/gallery/inputs/tag?layout=ExampleLayout&class=max-w-screen-md+mx-auto" class="mt-8 w-full border-none h-[1570px]"></iframe>

`<TagInput>` functions the same as other Input components and can be bound directly to a `List<string>` Request DTO property:

```html
<form @onsubmit="submit" @onsubmit:preventDefault class=@Class>
<CascadingValue Value=@apiQuery.Error>
  <div class="shadow sm:rounded-md bg-white dark:bg-black">
    <div class="relative px-4 py-5 sm:p-6">
      <fieldset>
        <ErrorSummary Except=@VisibleFields />
        <div class="grid grid-cols-12 gap-6">
          <div class="col-span-6">
            <TextInput @bind-Value="request.FirstName" />
          </div>
          <div class="col-span-6">
            <TextInput @bind-Value="request.LastName" />
          </div>
          <div class="col-span-12">
            <TagInput @bind-Value="request.Skills" />
          </div>
        </div>
      </fieldset>
    </div>
  </div>
</CascadingValue>
</form>
```

### NavList

The NavList component encapsulates Tailwind's beautiful List component, which is used extensively in [Blazor Gallery's Navigation](https://blazor-gallery.jamstacks.net/grid):

```html
<div class="max-w-screen-sm">
  <NavList
Title="Explore Blazor Components">
    <NavListItem Title="DataGrid" href="/gallery/datagrid" IconSvg=@Icons.DataGrid>
      DataGrid Component Examples for rendering tabular data
    </NavListItem>
    <NavListItem Title="AutoQuery Grid" href="/gallery/autoquerygrid" IconSvg=@Icons.AutoQueryGrid>
      Instant customizable UIs for calling AutoQuery CRUD APIs
    </NavListItem>
  </NavList>

  <h2 class="mt-8 text-base font-semibold text-gray-500 dark:text-gray-400 flex">
    <span title="Requires Auth"><Icon class="h-6 w-6 mr-2" Svg=@Icons.Padlock /></span>
    Booking APIs
  </h2>

  <NavList>
    <NavListItem Title="Bookings" href="/grid/bookings" Icon=@typeof(Booking).GetIcon()>
      Create and manage Bookings
    </NavListItem>
    <NavListItem Title="Coupons" href="/grid/coupons" Icon=@typeof(Coupon).GetIcon()>
      Create and manage discount Coupons
    </NavListItem>
  </NavList>
</div>
```

Where it will render a list of navigation links with descriptions and icons:

<div class="my-8 flex justify-center"> <iframe src="https://blazor-gallery.jamstacks.net/gallery/navigation/navlist" class="my-8 w-full h-[510px] border-none"></iframe> </div>

### Colored Buttons

The `PrimaryButton` component's `Style` parameter accepts `ButtonStyle` values to render buttons in Tailwind's different primary colors:

```html
<div class="grid gap-4 grid-cols-3">
  <PrimaryButton>Default</PrimaryButton>
  <PrimaryButton Style="ButtonStyle.Blue">Blue</PrimaryButton>
  <PrimaryButton Style="ButtonStyle.Purple">Purple</PrimaryButton>
  <PrimaryButton Style="ButtonStyle.Red">Red</PrimaryButton>
  <PrimaryButton Style="ButtonStyle.Green">Green</PrimaryButton>
  <PrimaryButton Style="ButtonStyle.Sky">Sky</PrimaryButton>
  <PrimaryButton Style="ButtonStyle.Cyan">Cyan</PrimaryButton>
</div>
```

<div class="my-8 flex justify-center"> <iframe src="https://blazor-gallery.jamstacks.net/gallery/navigation/buttons/styles" class="my-8 w-full h-[160px] border-none"></iframe> </div>

### Select Input

The `<SelectInput>` `Values` and `Entries` parameters can be used to populate options from an array of
strings or KeyValuePairs. It also includes declarative features enabling more capable declarative forms, which are typically restricted by the compile-time constant expression limitation of .NET attributes.

The `EvalAllowableValues` and `EvalAllowableEntries` attribute properties overcome this limitation by letting you define the Select options with a [#Script](https://sharpscript.net) expression, whose great [.NET scriptability](https://sharpscript.net/docs/script-net) lets you reference your App's .NET instances from a string expression.

This feature can be used to populate declarative Select options from a .NET instance, e.g:

```csharp
public class CreateModifier : ICreateDb<Modifier>, IReturn<Modifier>
{
    [ValidateNotEmpty]
    public string Name { get; set; }

    [ValidateNotEmpty]
    [Input(Type="select", EvalAllowableValues = "AppData.Categories")]
    public string Category { get; set; }

    public string? Description { get; set; }
}
```

That we register as a global variable in our AppHost's `ScriptContext`, which we can populate from a dynamic source like a DB table, e.g:

```csharp
using var db = container.Resolve<IDbConnectionFactory>().Open();
ScriptContext.Args[nameof(AppData)] = new AppData {
    Categories = db.Column<string>(db.From<Category>().Select(x => x.Name))
};
```

Where it will populate the Select input in all `CreateModifier` Auto Form components:

<div class="mt-8 flex justify-center"> <img src="/img/pages/blazor/diffusion-CreateModifier.png" class="max-w-screen-md" style="border:1px solid #CACACA"> </div>

## PreviewFormat

The `<PreviewFormat>` component is useful for rendering Table Cell data in different customizable formats, e.g:

```html
<PreviewFormat Value="50" Format=Formats.Currency />
<PreviewFormat Value="1000000" Format=Formats.Bytes />
<PreviewFormat Value=@Url Format=Formats.Icon IconClass="w-40 h-40" />
<PreviewFormat Value=@Url Format=Formats.Icon IconClass="w-40 h-40 rounded-full" />
<PreviewFormat Value=@Url Format=Formats.Attachment />
<PreviewFormat Value=@Path Format=Formats.Attachment />
<PreviewFormat Value=@Url Format=Formats.Link />
<PreviewFormat Value=@Email Format=Formats.LinkEmail />
<PreviewFormat Value=@Phone Format=Formats.LinkPhone />
```

[![](/img/pages/blazor/PreviewFormat.png)](https://blazor-gallery.servicestack.net/gallery/formats)

## HtmlFormat

The versatile `<HtmlFormat>` component can render any serializable object in a human-friendly HTML format, e.g:

### Single Model

```html
<div class="max-w-screen-sm">
    <HtmlFormat Value=@Track.Results[0] />
</div>
```

[![](/img/pages/blazor/HtmlFormat-single.png)](https://blazor-gallery.servicestack.net/gallery/formats)

### Item Collections

```html
<div class="max-w-screen-sm">
    <HtmlFormat Value=@Track.Results />
</div>
```

[![](/img/pages/blazor/HtmlFormat-collection.png)](https://blazor-gallery.servicestack.net/gallery/formats)

### Nested Complex Types

```html
<HtmlFormat Value=@Create.Players />
```

[![](/img/pages/blazor/HtmlFormat-complex.png)](https://blazor-gallery.servicestack.net/gallery/formats)

For more info about the available Blazor Components, check out the [Component Gallery](https://blazor-gallery.servicestack.net/gallery):

[![](/img/pages/blazor/component-gallery.png)](https://blazor-gallery.servicestack.net/gallery)

## Blazor Config

A lot of the default conventions used by the Blazor Components are overridable with [BlazorConfig](https://reference.servicestack.net/api/ServiceStack.Blazor/BlazorConfig) initialized in `Program.cs`, where Blazor WASM projects are configured with something like:

```csharp
BlazorConfig.Set(new BlazorConfig {
    IsWasm = true,
    Services = app.Services,
    FallbackAssetsBasePath = apiBaseUrl,
    EnableLogging = true,
    EnableVerboseLogging = builder.HostEnvironment.IsDevelopment(),
});
```

### Asset and Fallback Paths

`FallbackAssetsBasePath` lets you specify a fallback path for Images, which is useful when there's a delay syncing uploaded assets to the CDN that the Blazor WASM
client is deployed to, as it can fall back to referencing the asset from the .NET App Server that handled the file upload.

Alternatively, `AssetsBasePath` can be used to specify a different primary CDN location from the Blazor WASM App CDN, or `AssetsPathResolver` and `FallbackPathResolver` can be used when a more advanced custom strategy is required.

---
# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/blazor-diffusion.md
---
title: Blazor Diffusion
---

The goal of our increasing Blazor investments is to enable a highly productive and capable platform for rapidly developing the majority of an internal App's CRUD functionality, as well as enabling a hybrid development model where the management of back-office supporting tables can be quickly implemented with custom AutoQueryGrid components - freeing up developers to focus the majority of their efforts where they add the most value: the bespoke Blazor UIs of their customer-facing UX.

To best demonstrate its potential, we're excited to announce a new project that does exactly this!
<div class="not-prose my-8 flex justify-center"> <a href="https://blazordiffusion.com" class="flex items-center hover:no-underline" title="blazordiffusion.com"> <svg class="w-20 h-20 text-purple-600 mr-2" xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24"><path fill="currentColor" d="M23.834 8.101a13.912 13.912 0 0 1-13.643 11.72a10.105 10.105 0 0 1-1.994-.12a6.111 6.111 0 0 1-5.082-5.761a5.934 5.934 0 0 1 11.867-.084c.025.983-.401 1.846-1.277 1.871c-.936 0-1.374-.668-1.374-1.567v-2.5a1.531 1.531 0 0 0-1.52-1.533H8.715a3.648 3.648 0 1 0 2.695 6.08l.073-.11l.074.121a2.58 2.58 0 0 0 2.2 1.048a2.909 2.909 0 0 0 2.695-3.04a7.912 7.912 0 0 0-.217-1.933a7.404 7.404 0 0 0-14.64 1.603a7.497 7.497 0 0 0 7.308 7.405s.549.05 1.167.035a15.803 15.803 0 0 0 8.475-2.528c.036-.025.072.025.048.061a12.44 12.44 0 0 1-9.69 3.963a8.744 8.744 0 0 1-8.9-8.972a9.049 9.049 0 0 1 3.635-7.247a8.863 8.863 0 0 1 5.229-1.726h2.813a7.915 7.915 0 0 0 5.839-2.578a.11.11 0 0 1 .059-.034a.112.112 0 0 1 .12.053a.113.113 0 0 1 .015.067a7.934 7.934 0 0 1-1.227 3.549a.107.107 0 0 0-.014.06a.11.11 0 0 0 .073.095a.109.109 0 0 0 .062.004a8.505 8.505 0 0 0 5.913-4.876a.155.155 0 0 1 .055-.053a.15.15 0 0 1 .147 0a.153.153 0 0 1 .054.053A10.779 10.779 0 0 1 23.834 8.1zM8.895 11.628a2.188 2.188 0 1 0 2.188 2.188v-2.042a.158.158 0 0 0-.15-.15Z"/></svg> <h2 class="border-none text-4xl sm:text-5xl md:text-6xl tracking-tight font-extrabold"> <span class="text-purple-600 mr-6">Diffusion</span> </h2> </a> </div> [![blazordiffusion.com](/img/pages/blazor/blazordiffusion.com_splash.png)](https://blazordiffusion.com) [blazordiffusion.com](https://blazordiffusion.com) is a new ServiceStack.Blazor App front-end for [Stable Diffusion](https://en.wikipedia.org/wiki/Stable_Diffusion) - a deep learning text-to-image model that can generate quality images from a text prompt whose ability to run on commodity GPU hardware makes it one of the most exciting Open Source AI projects ever released. 
If you haven't experienced Stable Diffusion yet, we welcome you to create an account and start building your Stable Diffusion portfolio for FREE!

### Effortless Admin Pages

It's a great example of Hybrid Development in action: the entire user-facing UI is a bespoke Blazor App optimized for creating, searching, cataloging and discovering Stable Diffusion generated images, whilst all the supporting admin tasks for managing the back-office tables that power the UI were effortlessly implemented with custom AutoQueryGrid components.

To get a glimpse of this in action we've created a video showing how quick it was to build the first few Admin Pages:

<div class="my-8 flex justify-center"> <lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="tt0ytzVVjEY" style="background-image: url('https://img.youtube.com/vi/tt0ytzVVjEY/maxresdefault.jpg')"></lite-youtube> </div>

Blazor Diffusion is an example of a real-world App leveraging a number of different ServiceStack features to achieve its functionality. We're also using it to ["dog food"](https://en.wikipedia.org/wiki/Eating_your_own_dog_food) new ServiceStack features, helping identify any friction points or missing functionality we can feed back into the design and improvement of new and existing features - which it has done for most of the new features in this release.
### Blazor Server or Blazor WASM To ensure all new ServiceStack.Blazor features continue to work in both Blazor Server and Blazor WASM we're maintaining identical versions of Blazor Diffusion running in both of Blazor's hosting modes: <div class="py-8 flex justify-center"> <div class="flex flex-col"> <a href="https://github.com/NetCoreApps/BlazorDiffusion" class="text-xl text-gray-800 flex"> <svg class="w-6 h-6 mr-2 align-text-bottom" xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24"><path fill="currentColor" d="M12 .297c-6.63 0-12 5.373-12 12c0 5.303 3.438 9.8 8.205 11.385c.6.113.82-.258.82-.577c0-.285-.01-1.04-.015-2.04c-3.338.724-4.042-1.61-4.042-1.61C4.422 18.07 3.633 17.7 3.633 17.7c-1.087-.744.084-.729.084-.729c1.205.084 1.838 1.236 1.838 1.236c1.07 1.835 2.809 1.305 3.495.998c.108-.776.417-1.305.76-1.605c-2.665-.3-5.466-1.332-5.466-5.93c0-1.31.465-2.38 1.235-3.22c-.135-.303-.54-1.523.105-3.176c0 0 1.005-.322 3.3 1.23c.96-.267 1.98-.399 3-.405c1.02.006 2.04.138 3 .405c2.28-1.552 3.285-1.23 3.285-1.23c.645 1.653.24 2.873.12 3.176c.765.84 1.23 1.91 1.23 3.22c0 4.61-2.805 5.625-5.475 5.92c.42.36.81 1.096.81 2.22c0 1.606-.015 2.896-.015 3.286c0 .315.21.69.825.57C20.565 22.092 24 17.592 24 12.297c0-6.627-5.373-12-12-12"/></svg> <span>NetCoreApps/BlazorDiffusion</span> </a> <a href="https://github.com/NetCoreApps/BlazorDiffusionWasm" class="mt-2 text-xl text-gray-800 flex"> <svg class="w-6 h-6 mr-2 align-text-bottom" xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24"><path fill="currentColor" d="M12 .297c-6.63 0-12 5.373-12 12c0 5.303 3.438 9.8 8.205 11.385c.6.113.82-.258.82-.577c0-.285-.01-1.04-.015-2.04c-3.338.724-4.042-1.61-4.042-1.61C4.422 18.07 3.633 17.7 3.633 17.7c-1.087-.744.084-.729.084-.729c1.205.084 1.838 1.236 1.838 1.236c1.07 1.835 2.809 1.305 3.495.998c.108-.776.417-1.305.76-1.605c-2.665-.3-5.466-1.332-5.466-5.93c0-1.31.465-2.38 1.235-3.22c-.135-.303-.54-1.523.105-3.176c0 0 1.005-.322 3.3 
1.23c.96-.267 1.98-.399 3-.405c1.02.006 2.04.138 3 .405c2.28-1.552 3.285-1.23 3.285-1.23c.645 1.653.24 2.873.12 3.176c.765.84 1.23 1.91 1.23 3.22c0 4.61-2.805 5.625-5.475 5.92c.42.36.81 1.096.81 2.22c0 1.606-.015 2.896-.015 3.286c0 .315.21.69.825.57C20.565 22.092 24 17.592 24 12.297c0-6.627-5.373-12-12-12"/></svg> <span>NetCoreApps/BlazorDiffusionWasm</span> </a> </div> </div> Where it's initially developed from a Blazor Server project template to take advantage of its fast iterative dev model then uses a [script to export](https://github.com/NetCoreApps/BlazorDiffusionWasm/blob/main/sync.bat) all Pages and Server functionality to a Blazor WASM project template that's optimal for Internet deployments. ### Blazor Diffusion Features To help discovery we'll link to where new features in this release are used. <svg class="w-20 h-20 text-gray-800" xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24"><path fill="currentColor" d="M9.37 5.51A7.35 7.35 0 0 0 9.1 7.5c0 4.08 3.32 7.4 7.4 7.4c.68 0 1.35-.09 1.99-.27A7.014 7.014 0 0 1 12 19c-3.86 0-7-3.14-7-7c0-2.93 1.81-5.45 4.37-6.49zM12 3a9 9 0 1 0 9 9c0-.46-.04-.92-.1-1.36a5.389 5.389 0 0 1-4.4 2.26a5.403 5.403 0 0 1-3.14-9.8c-.44-.06-.9-.1-1.36-.1z"/></svg> ### Dark Mode The decision to build [blazordiffusion.com](https://blazordiffusion.com) was in large part due to choosing an App that would look best in Dark Mode, as-is often preferred when viewing images and video. The public UI uses [JS.init() to force Dark Mode](https://github.com/NetCoreApps/BlazorDiffusion/blob/v0.1/BlazorDiffusion/Pages/_Layout.cshtml#L63) whilst the Admin Pages uses a different [AdminLayout.razor](https://github.com/NetCoreApps/BlazorDiffusion/blob/v0.1/BlazorDiffusion/Shared/AdminLayout.razor) that allows dark mode to be toggled on/off as seen in the [BlazorDiffusion Video](https://www.youtube.com/watch?v=tt0ytzVVjEY). 
### AutoComplete

The [Create.razor](https://github.com/NetCoreApps/BlazorDiffusion/blob/v0.1/BlazorDiffusion/Pages/Create.razor) page uses the new `<Autocomplete>` to quickly select Artists and Modifiers.

<div class="mt-8 flex justify-center"> <a href="https://github.com/NetCoreApps/BlazorDiffusion/blob/main/BlazorDiffusion/Pages/Create.razor"> <img src="/img/pages/blazor/blazordiffusion-Autocomplete.png" style="width:600px"> </a> </div>

### Admin Pages

The [/admin](https://github.com/NetCoreApps/BlazorDiffusion/tree/main/BlazorDiffusion/Pages/admin) pages were all built using [AutoQueryGrid](https://blazor-gallery.jamstacks.net/grid) for their data management and use [NavList and Breadcrumbs](https://blazor-gallery.jamstacks.net/gallery/navigation) for navigation.

<div class="flex justify-center"> <a href="https://github.com/NetCoreApps/BlazorDiffusion/blob/main/BlazorDiffusion/Pages/admin/Index.razor"> <img src="/img/pages/blazor/blazordiffusion-admin-pages.png" style="width:600px"> </a> </div>

#### EditForm

The following components make use of `<EditForm>` AutoQueryGrid extensibility to display unique forms for their custom workflow requirements:

- [Creatives.razor](https://github.com/NetCoreApps/BlazorDiffusion/blob/main/BlazorDiffusion/Pages/admin/Creatives.razor)
- [ArtifactAutoQueryGrid.razor](https://github.com/NetCoreApps/BlazorDiffusion/blob/main/BlazorDiffusion/Shared/admin/ArtifactAutoQueryGrid.razor)
- [ArtifactReportsAutoQueryGrid.razor](https://github.com/NetCoreApps/BlazorDiffusion/blob/main/BlazorDiffusion/Shared/admin/ArtifactReportsAutoQueryGrid.razor)

```csharp
<AutoQueryGrid @ref=@grid Model="Creative" Apis="Apis.AutoQuery<QueryCreatives,UpdateCreative,HardDeleteCreative>()"
               ConfigureQuery="ConfigureQuery">
    <EditForm>
        <div class="relative z-10" aria-labelledby="slide-over-title" role="dialog" aria-modal="true">
            <div class="pointer-events-none fixed inset-y-0 right-0 flex max-w-full pl-10 sm:pl-16">
                <CreativeEdit Creative="context"
OnClose="grid!.OnEditDone" />
            </div>
        </div>
    </EditForm>
</AutoQueryGrid>
```

### SelectInput

The [Modifiers.razor](https://github.com/NetCoreApps/BlazorDiffusion/blob/main/BlazorDiffusion/Pages/admin/Modifiers.razor) admin page uses the [SelectInput EvalAllowableValues](https://github.com/NetCoreApps/BlazorDiffusion/blob/v0.1/BlazorDiffusion.ServiceModel/Creative.cs#L168-L187) feature to populate its options from a C# [AppData](https://github.com/NetCoreApps/BlazorDiffusion/blob/v0.1/BlazorDiffusion.ServiceModel/AppData.cs) property:

```csharp
public class CreateModifier : ICreateDb<Modifier>, IReturn<Modifier>
{
    [ValidateNotEmpty, Required]
    public string Name { get; set; }

    [ValidateNotEmpty, Required]
    [Input(Type="select", EvalAllowableValues = "AppData.Categories")]
    public string Category { get; set; }

    public string? Description { get; set; }
}
```

<div class="mt-8 flex justify-center"> <img src="/img/pages/blazor/diffusion-CreateModifier.png" class="max-w-screen-md" style="border:1px solid #CACACA"> </div>

### TagInput

The [Artists.razor](https://github.com/NetCoreApps/BlazorDiffusion/blob/main/BlazorDiffusion/Pages/admin/Artists.razor) admin page uses [declarative TagInput](https://github.com/NetCoreApps/BlazorDiffusion/blob/v0.1/BlazorDiffusion.ServiceModel/Creative.cs#L122-L141) to render its AutoQueryGrid Create and Edit Forms:

```csharp
public class UpdateArtist : IPatchDb<Artist>, IReturn<Artist>
{
    public int Id { get; set; }
    public string? FirstName { get; set; }
    public string? LastName { get; set; }
    public int? YearDied { get; set; }

    [Input(Type = "tag"), FieldCss(Field = "col-span-12")]
    public List<string>?
Type { get; set; } } ``` <div class="my-8 flex justify-center"> <img src="/img/pages/blazor/blazordiffusion-TagInput.png" class="max-w-screen-md" style="border:1px solid #CACACA"> </div> <h2 id="litestream" class="mx-auto max-w-screen-md text-center py-8 border-none"> <a href="https://litestream.io"> <img src="/img/pages/litestream/logo.svg"> </a> </h2> We're excited to be able to leverage our [support for Litestream](/ormlite/litestream) and showcase an example of architecting a production App at minimal cost which avoids paying for expensive managed hosted RDBMS's by effortlessly replicating its SQLite databases to object storage. <div class="mt-16 mx-auto max-w-7xl px-4"> <div class="text-center"> <h3 class="text-4xl tracking-tight font-extrabold text-gray-900 sm:text-5xl md:text-6xl"> <span class="block xl:inline">Reduce Complexity & Save Costs</span> </h3> <p class="mt-3 max-w-md mx-auto text-base text-gray-500 sm:text-lg md:mt-5 md:text-xl md:max-w-3xl"> Avoid expensive managed RDBMS servers, reduce deployment complexity, eliminate infrastructure dependencies & save order of magnitude costs vs production hosting </p> </div> <img src="/img/pages/litestream/litestream-costs.svg"> </div> To make it easy for Blazor Tailwind projects to take advantage of our first-class [Litestream support](/ormlite/litestream), we've created a new video combining these ultimate developer experience & value combo solutions that walks through how to deploy a new Blazor Tailwind SQLite + Litestream App to any Linux server with SSH access, Docker and Docker Compose: <lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="fY50dWszpw4" style="background-image: url('https://img.youtube.com/vi/fY50dWszpw4/maxresdefault.jpg')"></lite-youtube> ### Useful Blazor Litestream Video Links - [Blazor Litestream Tutorial](/blazor-litestream) - [Blazor](https://servicestack.net/blazor) - [Litestream](https://servicestack.net/litestream) - [Docker 
Install](https://docs.docker.com/engine/install/ubuntu/)
- [Docker Compose Install](https://docs.docker.com/compose/install/linux/#install-using-the-repository)

### Custom SQLite functions

Using SQLite also gives us access to features not available in other RDBMS's. E.g. for the "Explore Similar Artifacts" feature we're using a custom C# function, registered on the db connection, that we can call from SQL to find other Artifacts with the nearest background colors in [SearchService.cs](https://github.com/NetCoreApps/BlazorDiffusion/blob/v0.1/BlazorDiffusion.ServiceInterface/SearchService.cs):

```csharp
public static class DbFunctions
{
    public static void RegisterBgCompare(this IDbConnection db)
    {
        var sqliteConn = (SqliteConnection)db.ToDbConnection();
        sqliteConn.CreateFunction("bgcompare",
            (string a, string b) => ImageUtils.BackgroundCompare(a, b));
    }
}
```

After registering the function with the db connection we can reference it in our typed SQL Expression with OrmLite's `Sql.Custom()` API:

```csharp
db.RegisterBgCompare();
q.SelectDistinct<Artifact, Creative>((a, c) => new {
    a,
    c.UserPrompt,
    c.ArtistNames,
    c.ModifierNames,
    c.PrimaryArtifactId,
    Similarity = Sql.Custom($"bgcompare('{similarToArtifact.Background}',Background)"),
});
```

This same technique is also used for finding similar images with the [PerceptualHash](https://www.hackerfactor.com/blog/index.php?/archives/432-Looks-Like-It.html), [AverageHash](https://www.hackerfactor.com/blog/index.php?/archives/432-Looks-Like-It.html) & [DifferenceHash](http://01101001.net/programming.php) functions provided by the [ImageHash](https://github.com/coenm/ImageHash) library.

[SearchService.cs](https://github.com/NetCoreApps/BlazorDiffusion/blob/v0.1/BlazorDiffusion.ServiceInterface/SearchService.cs) itself is a great example of a complex [custom AutoQuery implementation](/autoquery/rdbms#custom-autoquery-implementations), being solely responsible for the entire search functionality on the home page.
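The underlying mechanism is Microsoft.Data.Sqlite's `CreateFunction` API, which the `RegisterBgCompare` extension above wraps. As a minimal standalone sketch (the `add2` function is an illustrative example, not from Blazor Diffusion), a C# lambda becomes callable from SQL like this:

```csharp
using Microsoft.Data.Sqlite;

// Open an in-memory SQLite connection for demonstration
using var conn = new SqliteConnection("Data Source=:memory:");
conn.Open();

// Register a C# lambda as a scalar SQL function named add2()
conn.CreateFunction("add2", (int a, int b) => a + b);

// The function can now be referenced anywhere SQL accepts an expression
using var cmd = conn.CreateCommand();
cmd.CommandText = "SELECT add2(40, 2)";
// SQLite returns integers as long
var result = (long)cmd.ExecuteScalar()!;
```

Because the function executes in-process, it can call into any .NET code (like `ImageUtils.BackgroundCompare`), at the cost of the query no longer being portable to other RDBMS's.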
### Hetzner US Cloud

Our analysis of [US Cloud Hosting Providers](https://servicestack.net/blog/finding-best-us-value-cloud-provider) led us to move to [Hetzner Cloud](https://www.hetzner.com/cloud) for hosting, where it costs vastly less than equivalent specs at a major cloud provider. But this also meant we had to look elsewhere for image storage to avoid AWS's expensive S3 egress costs, which can easily get out of control for a high-traffic image host.

### R2 Virtual Files Provider

Fortunately we were in time to take advantage of Cloudflare's inexpensive [R2 Object Storage solution](https://www.cloudflare.com/products/r2/) with **$0 egress fees**. Together with their generous free tier and the ability to serve R2 assets behind their free CDN, we ended up with a great-value, high-performance managed cloud storage solution whose only expected cost in the near future is R2's **$0.015 / GB storage** cost.

R2 is mostly S3 compatible, however it needed a custom `S3VirtualFiles` provider to work around missing features, which is being maintained in the new `R2VirtualFiles` VFS provider.
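Since R2 speaks the S3 protocol, the provider is wired up by pointing the AWS SDK's S3 client at R2's S3-compatible endpoint. A rough sketch, where the account id placeholder and the `appConfig` key/bucket names are assumptions for illustration:

```csharp
// Sketch: point the AWS SDK's S3 client at Cloudflare R2's S3-compatible endpoint.
// <ACCOUNT_ID>, credentials and bucket name are placeholder assumptions.
var s3Client = new AmazonS3Client(
    appConfig.R2AccessKey, appConfig.R2SecretKey,
    new AmazonS3Config {
        ServiceURL = "https://<ACCOUNT_ID>.r2.cloudflarestorage.com",
    });

// R2VirtualFiles then exposes the bucket through the same VFS abstraction
// as S3VirtualFiles, FileSystemVirtualFiles, etc.
VirtualFiles = new R2VirtualFiles(s3Client, appConfig.ArtifactBucket);
```

Because it implements the same VFS abstraction, the rest of the App (like the Managed Files Upload feature below) is unaware of which storage backend it's writing to.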
### Files Upload Transformer The [Managed Files Upload Feature](/locode/files-overview) is configured in [Configure.AppHost.cs](https://github.com/NetCoreApps/BlazorDiffusion/blob/main/BlazorDiffusion/Configure.AppHost.cs) and used for all website File Uploads: ```csharp var appFs = VirtualFiles = new R2VirtualFiles(s3Client, appConfig.ArtifactBucket); Plugins.Add(new FilesUploadFeature( new UploadLocation("artifacts", appFs, readAccessRole: RoleNames.AllowAnon, maxFileBytes: AppData.MaxArtifactSize), new UploadLocation("avatars", appFs, allowExtensions: FileExt.WebImages, // Use unique URL to invalidate CDN caches resolvePath: ctx => X.Map((CustomUserSession)ctx.Session, x => $"/avatars/{x.RefIdStr[..2]}/{x.RefIdStr}/{ctx.FileName}")!, maxFileBytes: AppData.MaxAvatarSize, transformFile:ImageUtils.TransformAvatarAsync) )); ``` It utilizes the new **transformFile:** option to transform an uploaded file and save a reference to the transformed file instead. This is used to only save a reference to the **128x128** resized avatar used by the App, whilst still persisting the original uploaded image in a [Background MQ](/background-mq) task in case a higher resolution of their avatar is needed later. 
```csharp
public class ImageUtils
{
    public static async Task<IHttpFile?> TransformAvatarAsync(FilesUploadContext ctx)
    {
        var originalMs = await ctx.File.InputStream.CopyToNewMemoryStreamAsync();

        // Offload persistence of original image to a background task
        using var mqClient = HostContext.AppHost.GetMessageProducer(ctx.Request);
        mqClient.Publish(new DiskTasks {
            SaveFile = new() {
                FilePath = ctx.Location.ResolvePath(ctx),
                Stream = originalMs,
            }
        });

        var resizedMs = await CropAndResizeAsync(originalMs, 128, 128, PngFormat.Instance);

        return new HttpFile(ctx.File) {
            FileName = $"{ctx.FileName.LastLeftPart('.')}_128.{ctx.File.FileName.LastRightPart('.')}",
            ContentLength = resizedMs.Length,
            InputStream = resizedMs,
        };
    }

    public static async Task<MemoryStream> CropAndResizeAsync(Stream inStream, int width, int height, IImageFormat format)
    {
        var outStream = new MemoryStream();
        var image = await Image.LoadAsync(inStream);
        using (image)
        {
            var clone = image.Clone(context => context
                .Resize(new ResizeOptions {
                    Mode = ResizeMode.Crop,
                    Size = new Size(width, height),
                }));
            await clone.SaveAsync(outStream, format);
        }
        outStream.Position = 0;
        return outStream;
    }
}
```

### Background MQ

[Background MQ](/background-mq) is utilized to improve API response times by offloading a number of non-essential background tasks to [BackgroundMqServices.cs](https://github.com/NetCoreApps/BlazorDiffusion/blob/main/BlazorDiffusion.ServiceInterface/BackgroundMqServices.cs), performing functions like:

- Saving a JSON metadata snapshot of Stable Diffusion generated images alongside the images themselves
- Writing Files to R2
- Recalculating temporal scores and ranking of Artifacts and Albums
- Recording Analytics

---
# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/templates/blazor-server.md
---
title: .NET 10 Blazor Tailwind Templates
---

<div class="not-prose">
<div id="blazor-server" class="hide-title mt-12 ml-20 flex flex-col items-center">
<div
class="flex"> <svg class="w-24 h-24 text-purple-600 mr-8" xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24"><path fill="currentColor" d="M23.834 8.101a13.912 13.912 0 0 1-13.643 11.72a10.105 10.105 0 0 1-1.994-.12a6.111 6.111 0 0 1-5.082-5.761a5.934 5.934 0 0 1 11.867-.084c.025.983-.401 1.846-1.277 1.871c-.936 0-1.374-.668-1.374-1.567v-2.5a1.531 1.531 0 0 0-1.52-1.533H8.715a3.648 3.648 0 1 0 2.695 6.08l.073-.11l.074.121a2.58 2.58 0 0 0 2.2 1.048a2.909 2.909 0 0 0 2.695-3.04a7.912 7.912 0 0 0-.217-1.933a7.404 7.404 0 0 0-14.64 1.603a7.497 7.497 0 0 0 7.308 7.405s.549.05 1.167.035a15.803 15.803 0 0 0 8.475-2.528c.036-.025.072.025.048.061a12.44 12.44 0 0 1-9.69 3.963a8.744 8.744 0 0 1-8.9-8.972a9.049 9.049 0 0 1 3.635-7.247a8.863 8.863 0 0 1 5.229-1.726h2.813a7.915 7.915 0 0 0 5.839-2.578a.11.11 0 0 1 .059-.034a.112.112 0 0 1 .12.053a.113.113 0 0 1 .015.067a7.934 7.934 0 0 1-1.227 3.549a.107.107 0 0 0-.014.06a.11.11 0 0 0 .073.095a.109.109 0 0 0 .062.004a8.505 8.505 0 0 0 5.913-4.876a.155.155 0 0 1 .055-.053a.15.15 0 0 1 .147 0a.153.153 0 0 1 .054.053A10.779 10.779 0 0 1 23.834 8.1zM8.895 11.628a2.188 2.188 0 1 0 2.188 2.188v-2.042a.158.158 0 0 0-.15-.15Z"/></svg> <svg class="w-28 h-28" xmlns="http://www.w3.org/2000/svg" width="256" height="154" viewBox="0 0 256 154"><defs><linearGradient id="logosTailwindcssIcon0" x1="-2.778%" x2="100%" y1="32%" y2="67.556%"><stop offset="0%" stop-color="#2298BD"/><stop offset="100%" stop-color="#0ED7B5"/></linearGradient></defs><path fill="url(#logosTailwindcssIcon0)" d="M128 0C93.867 0 72.533 17.067 64 51.2C76.8 34.133 91.733 27.733 108.8 32c9.737 2.434 16.697 9.499 24.401 17.318C145.751 62.057 160.275 76.8 192 76.8c34.133 0 55.467-17.067 64-51.2c-12.8 17.067-27.733 23.467-44.8 19.2c-9.737-2.434-16.697-9.499-24.401-17.318C174.249 14.743 159.725 0 128 0ZM64 76.8C29.867 76.8 8.533 93.867 0 128c12.8-17.067 27.733-23.467 44.8-19.2c9.737 2.434 16.697 9.499 24.401 17.318C81.751 138.857 96.275 153.6 128 
153.6c34.133 0 55.467-17.067 64-51.2c-12.8 17.067-27.733 23.467-44.8 19.2c-9.737-2.434-16.697-9.499-24.401-17.318C110.249 91.543 95.725 76.8 64 76.8Z"/></svg> </div> </div> <div class="relative bg-white dark:bg-black py-4"> <div class="mx-auto max-w-md px-4 text-center sm:max-w-3xl sm:px-6 lg:max-w-7xl lg:px-8"> <p class="mt-2 text-3xl font-extrabold tracking-tight text-gray-900 dark:text-gray-50 sm:text-4xl">Blazor Server Tailwind Template</p> <p class="mx-auto mt-5 max-w-prose text-xl text-gray-500"> Ultimate dev model & UX ideal for low-latency Intranet environments </p> </div> </div> <a href="https://blazor.web-templates.io"> <div class="block flex justify-center shadow hover:shadow-lg rounded py-1"> <img class="p-4" src="https://raw.githubusercontent.com/ServiceStack/Assets/master/csharp-templates/blazor.png"> </div> </a> </div> [ServiceStack.Blazor's Tailwind Components](/templates/blazor-components) also work flawlessly in [Blazor Server Apps](https://learn.microsoft.com/en-us/aspnet/core/blazor/hosting-models?view=aspnetcore-6.0#blazor-server) which benefits from fast startup and exceptional responsiveness in low latency environments thanks to its architecture of running your App in a server session that only needs to propagate thin UI Virtual DOM updates to clients. 
The Blazor Server App template offers a number of compelling advantages over Blazor WASM, including:

- A superior dev model and debugging experience
- Improved live-reload and faster iterative dev cycles
- Full access to .NET Server functionality
- Better start times & UI responsiveness
- Less complexity from unnecessary client project or pre-rendering solutions

However, [the limitations](https://learn.microsoft.com/en-us/aspnet/core/blazor/hosting-models?view=aspnetcore-6.0#blazor-server) of its highly-coupled stateful server rendering session architecture do make it a poor fit for most high-latency Internet sites, for which we continue to recommend our [Blazor WASM project template](/templates/blazor-tailwind).

To better showcase our growing Blazor functionality we've created the new Blazor Gallery websites, showcasing usage of the available rich Blazor Components for rapidly developing beautiful Tailwind Web Apps:

<div id="blazor-component-gallery" class="relative bg-white dark:bg-black py-4"> <div class="mx-auto max-w-md px-4 text-center sm:max-w-3xl sm:px-6 lg:max-w-7xl lg:px-8"> <p class="mt-2 text-3xl font-extrabold tracking-tight text-gray-900 dark:text-gray-50 sm:text-4xl">Blazor Gallery</p> <p class="mx-auto mt-5 max-w-prose text-xl text-gray-500"> Discover ServiceStack.Blazor Rich UI Components and Integrated Features </p> </div> </div>

[![](/img/pages/blazor/gallery-splash.png)](https://blazor-gallery.servicestack.net)

As our components support both hosting models we're maintaining identical Gallery sites running on both **Blazor Server** and **WASM**:

<div class="not-prose mb-16 mx-auto mt-5 max-w-md sm:flex sm:justify-center md:mt-8"> <div class="rounded-md shadow"> <a href="https://blazor-gallery.servicestack.net" class="flex w-full items-center justify-center rounded-md border border-transparent bg-indigo-600 px-8 py-3 text-base font-medium text-white hover:bg-indigo-700 md:py-4 md:px-10 md:text-lg hover:no-underline"> Blazor Server </a> </div> <div
class="mt-3 rounded-md shadow sm:mt-0 sm:ml-3"> <a href="https://blazor-gallery.jamstacks.net" class="flex w-full items-center justify-center rounded-md border border-transparent bg-white px-8 py-3 text-base font-medium text-indigo-600 hover:bg-gray-50 md:py-4 md:px-10 md:text-lg hover:no-underline"> Blazor WASM </a> </div> </div> For a closer look at ServiceStack.Blazor Components in action, download & run them to see how good they'll run in your Environment: <div class="flex flex-col"> <a href="https://github.com/NetCoreApps/BlazorGallery" class="flex text-xl text-gray-800"> <svg class="w-6 h-6 mr-2 align-text-bottom" xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24"><path fill="currentColor" d="M12 .297c-6.63 0-12 5.373-12 12c0 5.303 3.438 9.8 8.205 11.385c.6.113.82-.258.82-.577c0-.285-.01-1.04-.015-2.04c-3.338.724-4.042-1.61-4.042-1.61C4.422 18.07 3.633 17.7 3.633 17.7c-1.087-.744.084-.729.084-.729c1.205.084 1.838 1.236 1.838 1.236c1.07 1.835 2.809 1.305 3.495.998c.108-.776.417-1.305.76-1.605c-2.665-.3-5.466-1.332-5.466-5.93c0-1.31.465-2.38 1.235-3.22c-.135-.303-.54-1.523.105-3.176c0 0 1.005-.322 3.3 1.23c.96-.267 1.98-.399 3-.405c1.02.006 2.04.138 3 .405c2.28-1.552 3.285-1.23 3.285-1.23c.645 1.653.24 2.873.12 3.176c.765.84 1.23 1.91 1.23 3.22c0 4.61-2.805 5.625-5.475 5.92c.42.36.81 1.096.81 2.22c0 1.606-.015 2.896-.015 3.286c0 .315.21.69.825.57C20.565 22.092 24 17.592 24 12.297c0-6.627-5.373-12-12-12"/></svg> <span>NetCoreApps/BlazorGallery</span> </a> <a href="https://github.com/NetCoreApps/BlazorGalleryWasm" class="flex mt-2 text-xl text-gray-800"> <svg class="w-6 h-6 mr-2 align-text-bottom" xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24"><path fill="currentColor" d="M12 .297c-6.63 0-12 5.373-12 12c0 5.303 3.438 9.8 8.205 11.385c.6.113.82-.258.82-.577c0-.285-.01-1.04-.015-2.04c-3.338.724-4.042-1.61-4.042-1.61C4.422 18.07 3.633 17.7 3.633 17.7c-1.087-.744.084-.729.084-.729c1.205.084 1.838 1.236 1.838 
1.236c1.07 1.835 2.809 1.305 3.495.998c.108-.776.417-1.305.76-1.605c-2.665-.3-5.466-1.332-5.466-5.93c0-1.31.465-2.38 1.235-3.22c-.135-.303-.54-1.523.105-3.176c0 0 1.005-.322 3.3 1.23c.96-.267 1.98-.399 3-.405c1.02.006 2.04.138 3 .405c2.28-1.552 3.285-1.23 3.285-1.23c.645 1.653.24 2.873.12 3.176c.765.84 1.23 1.91 1.23 3.22c0 4.61-2.805 5.625-5.475 5.92c.42.36.81 1.096.81 2.22c0 1.606-.015 2.896-.015 3.286c0 .315.21.69.825.57C20.565 22.092 24 17.592 24 12.297c0-6.627-5.373-12-12-12"/></svg> <span>NetCoreApps/BlazorGalleryWasm</span> </a> <a href="https://docs.servicestack.net/vue/" class="flex mt-2 text-xl text-gray-800"> <svg class="w-6 h-6 mr-2 align-text-bottom" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" aria-hidden="true" role="img" class="w-28 h-28 sm:w-44 sm:h-44 iconify iconify--vscode-icons" width="1em" height="1em" viewBox="0 0 32 32"><path fill="#41b883" d="M24.4 3.925H30l-14 24.15L2 3.925h10.71l3.29 5.6l3.22-5.6Z"></path><path fill="#41b883" d="m2 3.925l14 24.15l14-24.15h-5.6L16 18.415L7.53 3.925Z"></path><path fill="#35495e" d="M7.53 3.925L16 18.485l8.4-14.56h-5.18L16 9.525l-3.29-5.6Z"></path></svg> <span>Vue Component Gallery</span> </a> </div> <p class="hide-h2"></p> ## Getting Started Customize and Download a new Blazor WASM Bootstrap project with your preferred project name: <h3 class="text-center">Download new C# Blazor Project</h3> <blazor-templates class="not-prose pb-8"></blazor-templates> Alternatively you can create & download a new Blazor Project with the [x dotnet tool](/dotnet-new): :::sh npx create-net blazor ProjectName ::: ## Universal Blazor Components Blazor Server has become our preferred platform for Interactive **Intranet** Apps which excels in **low-latency environments** to enable a best-in-class responsive end-user UX that also offers a superior development experience in Visual Studio's live reload where it enables a fast iterative development workflow and good debugging experience. 
A fantastic property of Blazor is its support for multiple hosting modes, which allows the same components to run in the Browser with Blazor WASM or be rendered on the Server with Blazor Server. But whilst Blazor is capable of it, this trait is typically conceded in most Apps with database access, where the official guidance recommends using [EF Core directly in Blazor components](https://learn.microsoft.com/en-us/aspnet/core/blazor/blazor-server-ef-core?view=aspnetcore-7.0) - effectively prohibiting reuse in Blazor WASM should you ever want to utilize Blazor's preferred hosting model for hosting your Blazor Apps on the Internet.

<div class="my-8 flex justify-center"> <lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="66DgLHExC9E" style="background-image: url('https://img.youtube.com/vi/66DgLHExC9E/maxresdefault.jpg')"></lite-youtube> </div>

## Blazing Fast Networkless APIs

Whilst better performing, having Blazor components access DBs directly encourages a more tightly-coupled and less reusable & testable architecture than the traditional well-defined API dev model used in client/server Mobile & Desktop Apps or Web SPA Apps like WASM.

To achieve the best of both worlds, we've enabled support for utilizing the In Process [Service Gateway](/service-gateway) in Blazor Server Apps, which lets you retain the traditional client/server dev model for invoking your Server APIs **In Process** - avoiding any serialization, HTTP networking or even Kestrel middleware overhead to invoke your APIs directly!

This enables using the **exact same source code** to call APIs in Blazor Server and WASM, which allows us to develop reusable Blazor Components that invoke the same Server APIs that serve Web, Mobile and Desktop Apps. Instead of using HttpClient to invoke your APIs, they're invoked directly from a C# method which preserves its StackTrace, letting you track the API call down to the Blazor UI component calling it.
ServiceStack's [Message-based API Design](/api-design) makes it possible for all API calls in ServiceStack.Blazor components and project templates to be routed through these 2 methods:

```csharp
public interface IServiceGatewayAsync
{
    Task<TResponse> SendAsync<TResponse>(object dto, CancellationToken ct=default);
    //...
}
public interface IServiceGatewayFormAsync
{
    Task<TResponse> SendFormAsync<TResponse>(object dto, MultipartFormDataContent form, CancellationToken ct);
}
```

::: info
The `SendFormAsync` API is a new method added to support multi-part API requests with File Uploads
:::

Which allows the HTTP `JsonApiClient` and networkless `InProcessGateway` clients to be used interchangeably. By default Blazor Server Apps now use the InProcess Gateway but can be switched over to invoke APIs using the HTTP `JsonApiClient` with:

```csharp
BlazorConfig.Set(new() { UseInProcessClient = false });
```

Which changes all `Api*` methods in Blazor components and Pages inheriting ServiceStack.Blazor's [BlazorComponentBase](https://reference.servicestack.net/api/ServiceStack.Blazor/BlazorComponentBase) to use the registered `JsonApiClient` client. Other components can access both the InProcess Gateway or `JsonApiClient` by injecting the `IClientFactory` dependency into their components, e.g:

```csharp
public class MyComponent : ComponentBase
{
    [Inject]
    public IClientFactory? ClientFactory { get; set; }
    public IServiceGateway Gateway => ClientFactory!.GetGateway();
    public JsonApiClient Client => ClientFactory!.GetClient();
}
```

This capability is what has made it possible for high-level "API-enabled" components like [AutoQuery Grids](https://blazor-gallery.jamstacks.net/grid) and [AutoForm](https://blazor-gallery.jamstacks.net/gallery/autoform) to support both Blazor Server and Blazor WASM utilizing the most efficient API client available to its platform.
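To make the interchangeability concrete, a component inheriting a base like the `MyComponent` above can invoke an API through the injected gateway. This is a minimal sketch assuming the `Hello`/`HelloResponse` sample DTOs used elsewhere in these docs; the component and method names are illustrative:

```csharp
// Sketch only: in Blazor Server this call is networkless (In Process);
// in Blazor WASM the exact same code goes over HTTP via JsonApiClient -
// no source changes are required to support both hosting models.
public class HelloComponent : MyComponent
{
    public async Task<string> GetGreetingAsync(string name)
    {
        HelloResponse response = await Gateway.SendAsync(new Hello { Name = name });
        return response.Result;
    }
}
```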
The Blazor Gallery websites themselves are also good demonstrations of being able to run **entire Web Apps** in both Blazor Server and WASM, with all development done in Blazor Server to take advantage of its superior iterative dev model, after which a script is used to "export" all pages to an identical Blazor WASM project.

## Api and ApiAsync methods

.NET was originally conceived to use Exceptions for error control flow, however there's been a tendency in modern languages & libraries to shun Exceptions and return errors as normal values, an approach we believe is a more flexible & ergonomic way to handle API responses.

### The ApiResult way

The `Api(Request)` and `ApiAsync(Request)` APIs return a typed `ApiResult<Response>` Value Result encapsulating either a Typed Response or a structured API Error populated in `ResponseStatus`, allowing you to handle API responses programmatically without `try/catch` handling.

The example below creates a new Booking:

```csharp
CreateBooking request = new();
ApiResult<IdResponse> api = new();

async Task OnSubmit()
{
    api = await Client.ApiAsync(request);
    if (api.Succeeded)
    {
        await done.InvokeAsync(api.Response!);
        request = new();
    }
}
```

Which despite its terseness handles both **success** and **error** API responses: **if successful** it invokes the `done()` callback notifying its parent of the new Booking API Response before resetting the Form's data model with a new Request DTO.

Upon **failure** the error response is populated in `api.Error`, which binds to the UI via Blazor's `<CascadingValue Value=@api.Error>` to propagate it to all its child components in order to show contextual validation errors next to their respective Input controls.
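The failure branch can also be inspected programmatically when you need custom handling beyond the cascaded UI binding. A hedged sketch (the method name and logging are illustrative, not from the template):

```csharp
async Task OnSubmitWithErrorHandling()
{
    api = await Client.ApiAsync(request);
    if (!api.Succeeded)
    {
        // api.Error is the structured ResponseStatus from the failed API call
        Console.WriteLine($"{api.Error?.ErrorCode}: {api.Error?.Message}");

        // Per-field validation errors are available in Error.Errors, which
        // ServiceStack.Blazor Input components use for contextual messages
        var fieldErrors = api.Error?.Errors ?? new List<ResponseError>();
    }
}
```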
### Public Pages & Components

To reduce boilerplate, your Blazor Pages & components can inherit the template's local [AppComponentBase.cs](https://github.com/LegacyTemplates/blazor-wasm/blob/main/MyApp.Client/AppComponentBase.cs) which inherits `BlazorComponentBase`, which gets injected with the `IClientFactory` and provides convenient access to the most common APIs:

```csharp
public class BlazorComponentBase : ComponentBase, IHasJsonApiClient
{
    [Inject]
    public IClientFactory? ClientFactory { get; set; }
    public IServiceGateway Gateway => ClientFactory!.GetGateway();
    public JsonApiClient Client => ClientFactory!.GetClient();

    public virtual Task<ApiResult<TResponse>> ApiAsync<TResponse>(IReturn<TResponse> request) => UseGateway
        ? Gateway.ManagedApiAsync(request)
        : Client.ManagedApiAsync(request);

    public virtual Task<ApiResult<EmptyResponse>> ApiAsync(IReturnVoid request); /*...*/
    public virtual Task<TResponse> SendAsync<TResponse>(IReturn<TResponse> request);
    public virtual Task<IHasErrorStatus> ApiAsync<Model>(object request);
    public virtual Task<ApiResult<Model>> ApiFormAsync<Model>(object requestDto, MultipartFormDataContent request);
}
```

### Protected Pages & Components

Pages and Components requiring Authentication should instead inherit from [AppAuthComponentBase](https://github.com/LegacyTemplates/blazor-wasm/blob/main/MyApp.Client/AppComponentBase.cs), which integrates with Blazor's Authentication Model to provide access to the currently authenticated user:

```csharp
public abstract class AppAuthComponentBase : AppComponentBase
{
    [CascadingParameter]
    protected Task<AuthenticationState>? AuthenticationStateTask { get; set; }
    protected bool HasInit { get; set; }
    protected bool IsAuthenticated => User?.Identity?.IsAuthenticated ?? false;
    protected ClaimsPrincipal? User { get; set; }

    protected override async Task OnParametersSetAsync()
    {
        var state = await AuthenticationStateTask!;
        User = state.User;
        HasInit = true;
    }
}
```

## Benefits of Shared DTOs

Typically with Web Apps, our client is using a different language to C#, so equivalent request DTOs need to be generated for the client.

### TypeScript Example

For example, TypeScript generated DTOs still give us typed end-to-end services with the help of tooling like [Add ServiceStack Reference](/add-servicestack-reference):

```csharp
[Route("/hello/{Name}")]
public class Hello : IReturn<HelloResponse>
{
    public string Name { get; set; }
}
public class HelloResponse
{
    public string Result { get; set; }
}
```

Turns into:

```typescript
// @Route("/hello/{Name}")
export class Hello implements IReturn<HelloResponse>
{
    public name: string;

    public constructor(init?: Partial<Hello>) { (Object as any).assign(this, init); }
    public getTypeName() { return 'Hello'; }
    public getMethod() { return 'POST'; }
    public createResponse() { return new HelloResponse(); }
}

export class HelloResponse
{
    public result: string;
    public responseStatus: ResponseStatus;

    public constructor(init?: Partial<HelloResponse>) { (Object as any).assign(this, init); }
}
```

When Request or Response DTOs change during development, the client DTOs need to be regenerated using a command like [`x csharp`](./add-servicestack-reference.md#simple-command-line-utilities).

### Blazor Server Example

When developing your Blazor Server UI however, you just change your shared request/response DTOs in the shared `ServiceModel` project, and both your client and server compile against the same request/response DTO classes. This eliminates the need for any additional step.
In the `ServiceModel` project, we still have: ```csharp [Route("/hello/{Name}")] public class Hello : IReturn<HelloResponse> { public string Name { get; set; } } public class HelloResponse { public string Result { get; set; } } ``` Which the Blazor C# App can use directly in its **.razor** pages: ```csharp @code { Hello request = new() { Name = "Blazor WASM" }; ApiResult<HelloResponse> api = new(); protected override async Task OnInitializedAsync() => await submit(); async Task submit() => api = await ApiAsync(request); } ``` ## ServiceStack.Blazor Components The [ServiceStack.Blazor Components library](/templates/blazor-components) contains integrated functionality for Blazor including API-enabled base components, HTML Utils and Tailwind UI Input components heavily utilized throughout the template. --- # Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/templates/blazor-tailwind.md --- title: Blazor Tailwind Templates --- <div class="not-prose"> <div id="blazor-server" class="hide-title mt-12 ml-20 flex flex-col items-center"> <div class="flex"> <svg class="w-24 h-24 text-purple-600 mr-8" xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24"><path fill="currentColor" d="M23.834 8.101a13.912 13.912 0 0 1-13.643 11.72a10.105 10.105 0 0 1-1.994-.12a6.111 6.111 0 0 1-5.082-5.761a5.934 5.934 0 0 1 11.867-.084c.025.983-.401 1.846-1.277 1.871c-.936 0-1.374-.668-1.374-1.567v-2.5a1.531 1.531 0 0 0-1.52-1.533H8.715a3.648 3.648 0 1 0 2.695 6.08l.073-.11l.074.121a2.58 2.58 0 0 0 2.2 1.048a2.909 2.909 0 0 0 2.695-3.04a7.912 7.912 0 0 0-.217-1.933a7.404 7.404 0 0 0-14.64 1.603a7.497 7.497 0 0 0 7.308 7.405s.549.05 1.167.035a15.803 15.803 0 0 0 8.475-2.528c.036-.025.072.025.048.061a12.44 12.44 0 0 1-9.69 3.963a8.744 8.744 0 0 1-8.9-8.972a9.049 9.049 0 0 1 3.635-7.247a8.863 8.863 0 0 1 5.229-1.726h2.813a7.915 7.915 0 0 0 5.839-2.578a.11.11 0 0 1 .059-.034a.112.112 0 0 1 .12.053a.113.113 0 0 1 
.015.067a7.934 7.934 0 0 1-1.227 3.549a.107.107 0 0 0-.014.06a.11.11 0 0 0 .073.095a.109.109 0 0 0 .062.004a8.505 8.505 0 0 0 5.913-4.876a.155.155 0 0 1 .055-.053a.15.15 0 0 1 .147 0a.153.153 0 0 1 .054.053A10.779 10.779 0 0 1 23.834 8.1zM8.895 11.628a2.188 2.188 0 1 0 2.188 2.188v-2.042a.158.158 0 0 0-.15-.15Z"/></svg> <svg class="w-28 h-28" xmlns="http://www.w3.org/2000/svg" width="256" height="154" viewBox="0 0 256 154"><defs><linearGradient id="logosTailwindcssIcon0" x1="-2.778%" x2="100%" y1="32%" y2="67.556%"><stop offset="0%" stop-color="#2298BD"/><stop offset="100%" stop-color="#0ED7B5"/></linearGradient></defs><path fill="url(#logosTailwindcssIcon0)" d="M128 0C93.867 0 72.533 17.067 64 51.2C76.8 34.133 91.733 27.733 108.8 32c9.737 2.434 16.697 9.499 24.401 17.318C145.751 62.057 160.275 76.8 192 76.8c34.133 0 55.467-17.067 64-51.2c-12.8 17.067-27.733 23.467-44.8 19.2c-9.737-2.434-16.697-9.499-24.401-17.318C174.249 14.743 159.725 0 128 0ZM64 76.8C29.867 76.8 8.533 93.867 0 128c12.8-17.067 27.733-23.467 44.8-19.2c9.737 2.434 16.697 9.499 24.401 17.318C81.751 138.857 96.275 153.6 128 153.6c34.133 0 55.467-17.067 64-51.2c-12.8 17.067-27.733 23.467-44.8 19.2c-9.737-2.434-16.697-9.499-24.401-17.318C110.249 91.543 95.725 76.8 64 76.8Z"/></svg> </div> </div> <div class="relative bg-white dark:bg-black py-4"> <div class="mx-auto max-w-md px-4 text-center sm:max-w-3xl sm:px-6 lg:max-w-7xl lg:px-8"> <p class="mt-2 text-3xl font-extrabold tracking-tight text-gray-900 dark:text-gray-50 sm:text-5xl">Blazor Tailwind Templates</p> <p class="mx-auto mt-5 max-w-prose text-xl text-gray-500"> Rich Blazor templates with Tailwind CSS for building beautiful, responsive Apps </p> </div> </div> </div> The feature-rich Blazor Tailwind templates are ideal for teams with strong C# skills building Line Of Business (LOB) applications who prefer utilizing Tailwind's modern utility-first CSS design system to create beautiful, instant-loading Blazor Apps. 
[ServiceStack.Blazor's Tailwind Components](/templates/blazor-components) work seamlessly across both [Blazor Server](https://learn.microsoft.com/en-us/aspnet/core/blazor/hosting-models?view=aspnetcore-6.0#blazor-server) and Blazor WASM hosting models, allowing you to choose the best approach for your use case.

<div class="py-8 max-w-7xl mx-auto px-4 sm:px-6"> <lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="BXjcKkaK-nM" style="background-image: url('https://img.youtube.com/vi/BXjcKkaK-nM/maxresdefault.jpg')"></lite-youtube> </div>

## Getting Started

Customize and download a new Blazor Tailwind project with your preferred project name:

<blazor-templates class="not-prose pb-8"></blazor-templates>

Alternatively you can create & download a new Blazor Project with the [x dotnet tool](/dotnet-new):

:::sh
npx create-net blazor ProjectName
:::

<a href="https://blazor.web-templates.io"> <div class="block flex justify-center shadow hover:shadow-lg rounded py-1"> <img class="p-4" src="https://raw.githubusercontent.com/ServiceStack/Assets/master/csharp-templates/blazor.png"> </div> <div class="pt-4 text-center"> blazor.web-templates.io </div> </a>

## Blazor Rendering Modes

### Blazor Server

Blazor Server has become the preferred platform for Interactive **Intranet** Apps which excels in **low-latency environments** to enable a best-in-class responsive end-user UX. It offers several compelling advantages:

- A superior dev model and debugging experience
- Improved live-reload and faster iterative dev cycles
- Full access to .NET Server functionality
- Better start times & UI responsiveness
- Less complexity from unnecessary client project or pre-rendering solutions

However, [the limitations](https://learn.microsoft.com/en-us/aspnet/core/blazor/hosting-models?view=aspnetcore-6.0#blazor-server) of its highly-coupled stateful server rendering session architecture do make it a poor fit for most high-latency Internet sites.
### Blazor WASM with InteractiveAuto

For Internet-facing applications, Blazor WASM with the `InteractiveAuto` render mode provides the best user experience. Our [blazor](https://github.com/NetCoreTemplates/blazor) template uses `InteractiveAuto` by default to provide a more responsive UI with static Server Side Rendering (SSR) for faster initial page loads and better SEO.

### Render Modes

Blazor for .NET 10 has [four different rendering modes](https://learn.microsoft.com/en-us/aspnet/core/blazor/components/render-modes?view=aspnetcore-8.0#render-modes) you can take advantage of:

- Static Server (static SSR)
- Interactive Server
- Interactive WebAssembly (WASM)
- Interactive Auto

For non-interactive pages, the static SSR mode is the fastest, as it renders the page on the server and sends the HTML to the client. However, when your page needs to be interactive, you need to use one of the interactive modes.

Prior to .NET 10, there was a trade-off between the two available render modes (static server rendering wasn't yet available). The `Interactive Server` mode was faster to load, but the `Interactive WASM` mode was more responsive. The initial load times for `Interactive WASM` could be quite slow, as the entire application and all its dependencies needed to be downloaded before the page could render most of the content.

<img class="border-gray-800 border-t border-r" src="/img/pages/blazor/wasm/blazor-wasm-6-slow.gif">

> The initial load time for the `Interactive WASM` mode can be quite slow even for a minimal app

Our templates previously worked around this limitation with a custom Pre-Rendering solution, as the wait times were too long for a good user experience. .NET 10's new `Interactive Auto` mode provides the best of both worlds as pre-rendering is now enabled by default.
<img class="border-gray-800 border-r" src="/img/pages/blazor/wasm/blazor-wasm-8-fast.gif">

When the page is first loaded, it uses the `Interactive Server` mode, which is faster than `Interactive WASM` as it doesn't need to download the WASM resources. The user can start interacting with the page straight away, albeit with a slight delay on each interaction due to the round-trip to the server. In the background, the WASM resources are downloaded and are then used to render the site on the client for subsequent visits.

## Using InteractiveAuto in your Blazor application

In Blazor for .NET 10, render modes can be set on both a per-page and per-component basis:

```html
@page "/counter"
@rendermode InteractiveAuto

<Counter />
```

```html
<Counter @rendermode="RenderMode.InteractiveAuto" />
```

---

## Universal Blazor Components

A fantastic property of Blazor is its support for multiple hosting modes which allows the same components to run in the Browser with Blazor WASM or be rendered on the Server with Blazor Server.

But whilst Blazor is capable of it, this trait is typically conceded in most Apps with database access, where the official guidance recommends using [EF Core directly in Blazor components](https://learn.microsoft.com/en-us/aspnet/core/blazor/blazor-server-ef-core?view=aspnetcore-7.0) - effectively prohibiting reuse in Blazor WASM should you ever want to utilize Blazor's preferred hosting model for hosting your Blazor Apps on the Internet.
<div class="my-8 flex justify-center"> <lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="66DgLHExC9E" style="background-image: url('https://img.youtube.com/vi/66DgLHExC9E/maxresdefault.jpg')"></lite-youtube> </div>

### Blazing Fast Networkless APIs

Whilst better performing, having Blazor components access DBs directly encourages a more tightly-coupled and less reusable & testable architecture than the traditional well-defined API dev model used in client/server Mobile & Desktop Apps or Web SPA Apps like WASM.

To achieve the best of both worlds, we've enabled support for utilizing the In Process [Service Gateway](/service-gateway) in Blazor Server Apps, which lets you retain the traditional client/server dev model for invoking your Server APIs **In Process** - avoiding any serialization, HTTP networking or even Kestrel middleware overhead to invoke your APIs directly!

This enables using the **exact same source code** to call APIs in Blazor Server and WASM, which allows us to develop reusable Blazor Components that invoke the same Server APIs that serve Web, Mobile and Desktop Apps. Instead of using HttpClient to invoke your APIs, they're invoked directly from a C# method which preserves its StackTrace, letting you track the API call down to the Blazor UI component calling it.

ServiceStack's [Message-based API Design](/api-design) makes it possible for all API calls in ServiceStack.Blazor components and project templates to be routed through these 2 methods:

```csharp
public interface IServiceGatewayAsync
{
    Task<TResponse> SendAsync<TResponse>(object dto, CancellationToken ct=default);
    //...
}
public interface IServiceGatewayFormAsync
{
    Task<TResponse> SendFormAsync<TResponse>(object dto, MultipartFormDataContent form, CancellationToken ct);
}
```

::: info
The `SendFormAsync` API is a new method added to support multi-part API requests with File Uploads
:::

Which allows the HTTP `JsonApiClient` and networkless `InProcessGateway` clients to be used interchangeably. By default Blazor Server Apps now use the InProcess Gateway but can be switched over to invoke APIs using the HTTP `JsonApiClient` with:

```csharp
BlazorConfig.Set(new() { UseInProcessClient = false });
```

Which changes all `Api*` methods in Blazor components and Pages inheriting ServiceStack.Blazor's [BlazorComponentBase](https://reference.servicestack.net/api/ServiceStack.Blazor/BlazorComponentBase) to use the registered `JsonApiClient` client. Other components can access both the InProcess Gateway or `JsonApiClient` by injecting the `IClientFactory` dependency into their components, e.g:

```csharp
public class MyComponent : ComponentBase
{
    [Inject]
    public IClientFactory? ClientFactory { get; set; }
    public IServiceGateway Gateway => ClientFactory!.GetGateway();
    public JsonApiClient Client => ClientFactory!.GetClient();
}
```

This capability is what has made it possible for high-level "API-enabled" components like [AutoQuery Grids](https://blazor-gallery.jamstacks.net/grid) and [AutoForm](https://blazor-gallery.jamstacks.net/gallery/autoform) to support both Blazor Server and Blazor WASM utilizing the most efficient API client available to its platform.

The Blazor Gallery websites themselves are also good demonstrations of being able to run **entire Web Apps** in both Blazor Server and WASM, with all development done in Blazor Server to take advantage of its superior iterative dev model, after which a script is used to "export" all pages to an identical Blazor WASM project.
---

## ServiceStack.Blazor Components

The [ServiceStack.Blazor Components](https://blazor-gallery.jamstacks.net) have been updated for .NET 10 and work with the new `InteractiveAuto` render mode. This means you can focus more on your application logic and less on the UI, as the components provide a high-productivity UI for common tasks such as CRUD operations.

### AutoQueryGrid

The [AutoQueryGrid](https://blazor-gallery.servicestack.net/gallery/autoquerygrid) component provides a full-featured data grid that can be used to display and edit data from an AutoQuery service. This is ideal for creating custom admin pages for your application. By integrating your admin screens into your application, you can optimize the user experience for specific workflows and get a huge amount of reuse of your existing AutoQuery services.

```html
<AutoQueryGrid Model="Modifier" Apis="Apis.AutoQuery<QueryModifiers,CreateModifier,UpdateModifier,DeleteModifier>()" />
```

:::{.wideshot}
![](/img/pages/blazor/wasm/autoquerygrid.png)
:::

For [BlazorDiffusion](https://github.com/NetCoreApps/BlazorDiffusionAuto), our StableDiffusion example application, we used the AutoQueryGrid to create a custom admin page for managing the modifiers in the application. This is the simplest and fastest use of the AutoQueryGrid component, but it can also be heavily customized for lots of different use cases.

In [BlazorDiffusion](https://github.com/NetCoreApps/BlazorDiffusionAuto) we customize the grid to enable easy contextual navigation between separate customized admin screens for each Creative, linking to related table data.
:::{.wideshot}
![](/img/pages/blazor/wasm/blazordiffusion-creatives.png)
:::

```html
<AutoQueryGrid @ref=@grid Model="Creative"
    Apis="Apis.AutoQuery<QueryCreatives,UpdateCreative,HardDeleteCreative>()"
    ConfigureQuery="ConfigureQuery">
    <EditForm>
        <div class="relative z-10" aria-labelledby="slide-over-title" role="dialog" aria-modal="true">
            <div class="pointer-events-none fixed inset-y-0 right-0 flex max-w-full pl-10 sm:pl-16">
                <CreativeEdit Creative="context" OnClose="grid.OnEditDone" />
            </div>
        </div>
    </EditForm>
    <Columns>
        <Column Title="User" Field="(Creative x) => x.OwnerId" />
        <Column Title="Id" Field="(Creative x) => x.Id" />
        <Column Field="(Creative x) => x.Modifiers">
            <Template>
                @if (context.Modifiers?.Count > 0)
                {
                    <TextLink class="flex" href=@($"/admin/modifiers?Ids={string.Join(",", context.Modifiers.Select(x => x.ModifierId))}")>
                        <Icon class="w-6 h-6 mr-1" Image=@typeof(Modifier).GetIcon() />
                        @TextUtils.Pluralize("Modifier", context.Modifiers)
                    </TextLink>
                }
            </Template>
        </Column>
        <Column Field="(Creative x) => x.Artists">
            <Template>
                @if (context.Artists?.Count > 0)
                {
                    <TextLink class="flex" href=@($"/admin/artists?Ids={string.Join(",", context.Artists.Select(x => x.ArtistId))}")>
                        <Icon class="w-6 h-6 mr-1" Image=@typeof(Artist).GetIcon() />
                        @TextUtils.Pluralize("Artist", context.Artists)
                    </TextLink>
                }
            </Template>
        </Column>
        <Column Field="(Creative x) => x.Artifacts">
            <Template>
                @if (context.Artifacts?.Count > 0)
                {
                    <TextLink class="flex" href=@($"/admin/artifacts?CreativeId={context.Id}")>
                        <Icon class="w-6 h-6 mr-1" Image=@typeof(Artifact).GetIcon() />
                        @TextUtils.Pluralize("Artifact", context.Artifacts)
                    </TextLink>
                }
            </Template>
        </Column>
        <Column Field="(Creative x) => x.Key" />
        <Column Field="(Creative x) => x.CreatedDate" Format="s" />
        <Column Field="(Creative x) => x.UserPrompt" />
    </Columns>
</AutoQueryGrid>
```

In the above example, we use the `ConfigureQuery` parameter to customize the query used by the AutoQueryGrid when displaying values.
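A `ConfigureQuery` handler is simply a callback that mutates the populated AutoQuery Request DTO before the query is executed. As a hedged sketch, assuming the callback receives the grid's `QueryBase` Request DTO (the `OwnerId` property and `currentUserId` field are illustrative, not from the BlazorDiffusion source):

```csharp
// Sketch only: restrict the grid to Creatives owned by the current user
// by applying a typed filter on the AutoQuery Request DTO
void ConfigureQuery(QueryBase query)
{
    if (query is QueryCreatives typed && currentUserId != null)
        typed.OwnerId = currentUserId; // illustrative filter property
}
```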
This is ideal if you want to filter the data for specific workflows, for example, only showing the data that is relevant to the current user. We combine this with a `Tabs` component to provide a navigation bar for the user to switch between the different filters on the same AutoQueryGrid. ```html <Tabs TabOptions="TabOptions" TabChanged="TabChangedAsync" /> ``` :::{.shadow .max-w-screen-sm} ![](/img/pages/blazor/wasm/blazordiffusion-tab.png) ::: <p></p> :::{.shadow .max-w-screen-sm} ![](/img/pages/blazor/wasm/blazordiffusion-tab1.png) ::: We also use the `EditForm` parameter to customize the edit form for the AutoQueryGrid, so the workflow for editing a Creative is optimized using your own completely custom UI. ```html <AutoQueryGrid @ref=@grid Model="Creative" Apis="Apis.AutoQuery<QueryCreatives,UpdateCreative,HardDeleteCreative>()" ConfigureQuery="ConfigureQuery"> <EditForm> <div class="relative z-10" aria-labelledby="slide-over-title" role="dialog" aria-modal="true"> <div class="pointer-events-none fixed inset-y-0 right-0 flex max-w-full pl-10 sm:pl-16"> <CreativeEdit Creative="context" OnClose="grid.OnEditDone" /> </div> </div> </EditForm> ``` ### Running on both Server and Client When using the `InteractiveAuto` mode, first visits will run on the server, so your pages and components need to be available to both projects, as well as have any required dependencies registered in both projects' `Program.cs` files. By placing your shared pages and components in a shared project like the `.Client` project in the `blazor-wasm` template, you can easily share them between the two projects. Look for any of your pages or components that use the `@inject` directive, as these dependencies will need to be registered in both projects. ::: info Avoid sharing sensitive information via dependency injection, as it will be available to the client at runtime, where it can be decompiled and inspected.
::: ### Source code and live demo The source code for the upgraded `BlazorDiffusionAuto` application is [available on GitHub](https://github.com/NetCoreApps/BlazorDiffusionAuto) and you can view a live demo of the application at [auto.blazordiffusion.com](https://auto.blazordiffusion.com). ### Conclusion The new `InteractiveAuto` mode in Blazor for .NET 10 provides the best of both worlds for Blazor applications. A built in pre-rendering solution means that you can have a fast initial load time, but still have a responsive UI for subsequent visits. And since the ServiceStack.Blazor components have been updated for .NET 10, you can take advantage of the high-productivity UI components to quickly create customizable and professional-looking admin pages in a Blazor application. --- <div id="blazor-components" class="hide-title not-prose mt-16 mb-8 ml-20 flex flex-col items-center"> <div class="flex"> <svg class="w-40 h-40 text-purple-600 mr-8" xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24"><path fill="currentColor" d="M23.834 8.101a13.912 13.912 0 0 1-13.643 11.72a10.105 10.105 0 0 1-1.994-.12a6.111 6.111 0 0 1-5.082-5.761a5.934 5.934 0 0 1 11.867-.084c.025.983-.401 1.846-1.277 1.871c-.936 0-1.374-.668-1.374-1.567v-2.5a1.531 1.531 0 0 0-1.52-1.533H8.715a3.648 3.648 0 1 0 2.695 6.08l.073-.11l.074.121a2.58 2.58 0 0 0 2.2 1.048a2.909 2.909 0 0 0 2.695-3.04a7.912 7.912 0 0 0-.217-1.933a7.404 7.404 0 0 0-14.64 1.603a7.497 7.497 0 0 0 7.308 7.405s.549.05 1.167.035a15.803 15.803 0 0 0 8.475-2.528c.036-.025.072.025.048.061a12.44 12.44 0 0 1-9.69 3.963a8.744 8.744 0 0 1-8.9-8.972a9.049 9.049 0 0 1 3.635-7.247a8.863 8.863 0 0 1 5.229-1.726h2.813a7.915 7.915 0 0 0 5.839-2.578a.11.11 0 0 1 .059-.034a.112.112 0 0 1 .12.053a.113.113 0 0 1 .015.067a7.934 7.934 0 0 1-1.227 3.549a.107.107 0 0 0-.014.06a.11.11 0 0 0 .073.095a.109.109 0 0 0 .062.004a8.505 8.505 0 0 0 5.913-4.876a.155.155 0 0 1 .055-.053a.15.15 0 0 1 .147 0a.153.153 0 0 1 .054.053A10.779 
10.779 0 0 1 23.834 8.1zM8.895 11.628a2.188 2.188 0 1 0 2.188 2.188v-2.042a.158.158 0 0 0-.15-.15Z"/></svg> <svg class="w-44 h-44" xmlns="http://www.w3.org/2000/svg" width="256" height="154" viewBox="0 0 256 154"><defs><linearGradient id="logosTailwindcssIcon0" x1="-2.778%" x2="100%" y1="32%" y2="67.556%"><stop offset="0%" stop-color="#2298BD"/><stop offset="100%" stop-color="#0ED7B5"/></linearGradient></defs><path fill="url(#logosTailwindcssIcon0)" d="M128 0C93.867 0 72.533 17.067 64 51.2C76.8 34.133 91.733 27.733 108.8 32c9.737 2.434 16.697 9.499 24.401 17.318C145.751 62.057 160.275 76.8 192 76.8c34.133 0 55.467-17.067 64-51.2c-12.8 17.067-27.733 23.467-44.8 19.2c-9.737-2.434-16.697-9.499-24.401-17.318C174.249 14.743 159.725 0 128 0ZM64 76.8C29.867 76.8 8.533 93.867 0 128c12.8-17.067 27.733-23.467 44.8-19.2c9.737 2.434 16.697 9.499 24.401 17.318C81.751 138.857 96.275 153.6 128 153.6c34.133 0 55.467-17.067 64-51.2c-12.8 17.067-27.733 23.467-44.8 19.2c-9.737-2.434-16.697-9.499-24.401-17.318C110.249 91.543 95.725 76.8 64 76.8Z"/></svg> </div> <h2 class="border-none text-4xl sm:text-5xl md:text-6xl tracking-tight font-extrabold"> <span class="text-purple-600 mr-6">Blazor</span> <span class="mr-6" style="color:#44A8B3">Tailwind</span> </h2> </div> ### Blazor Tailwind Components [Tailwind](https://tailwindcss.com) has quickly become the best modern CSS framework we've used to create scalable, [mobile-first responsive](https://tailwindcss.com/#mobile-first) websites built upon a beautiful expert-crafted constraint-based [Design System](https://tailwindcss.com/#constraint-based) that enabled effortless reuse of a growing suite of [Free Community](https://tailwindcomponents.com) and professionally-designed [Tailwind UI Component Libraries](https://tailwindui.com) which has proven invaluable in quickly creating beautiful websites & docs that have benefited all our new modern jamstacks.net templates. 
[![](/img/pages/blazor/tailwindui.png)](https://tailwindui.com) ### ServiceStack.Blazor Components Many of Tailwind UI's popular components are encapsulated in ServiceStack.Blazor's rich high-level Tailwind components to enable the rapid development of CRUD UIs in Blazor Server and WASM Apps: <div class="my-8 flex justify-center"> <lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="iKpQI2233nY" style="background-image: url('https://img.youtube.com/vi/iKpQI2233nY/maxresdefault.jpg')"></lite-youtube> </div> <div id="blazor-component-gallery" class="mt-16 relative bg-white py-4"> <div class="mx-auto max-w-md px-4 text-center sm:max-w-3xl sm:px-6 lg:max-w-7xl lg:px-8"> <p class="mt-2 text-3xl font-extrabold tracking-tight text-gray-900 dark:text-gray-50 sm:text-4xl">Blazor Gallery</p> <p class="mx-auto mt-5 max-w-prose text-xl text-gray-500"> Discover ServiceStack.Blazor Rich UI Components and Integrated Features </p> </div> </div> [![](/img/pages/blazor/gallery-splash.png)](https://blazor-gallery.servicestack.net) ServiceStack.Blazor Components support both hosting models, which sees Blazor Gallery running on both **Blazor Server** and **WASM**: <div class="not-prose mb-16 mx-auto mt-5 max-w-md sm:flex sm:justify-center md:mt-8"> <div class="rounded-md shadow"> <a href="https://blazor-gallery.servicestack.net" class="flex w-full items-center justify-center rounded-md border border-transparent bg-indigo-600 px-8 py-3 text-base font-medium text-white hover:bg-indigo-700 md:py-4 md:px-10 md:text-lg hover:no-underline"> Blazor Server </a> </div> <div class="mt-3 rounded-md shadow sm:mt-0 sm:ml-3"> <a href="https://blazor-gallery.jamstacks.net" class="flex w-full items-center justify-center rounded-md border border-transparent bg-white px-8 py-3 text-base font-medium text-indigo-600 hover:bg-gray-50 md:py-4 md:px-10 md:text-lg hover:no-underline"> Blazor WASM </a> </div> </div> For a closer look at ServiceStack.Blazor Components in action, download &
run them to see how good they'll run in your Environment: <div class="flex flex-col"> <a href="https://github.com/NetCoreApps/BlazorGallery" class="flex text-xl text-gray-800"> <svg class="w-6 h-6 mr-2 align-text-bottom" xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24"><path fill="currentColor" d="M12 .297c-6.63 0-12 5.373-12 12c0 5.303 3.438 9.8 8.205 11.385c.6.113.82-.258.82-.577c0-.285-.01-1.04-.015-2.04c-3.338.724-4.042-1.61-4.042-1.61C4.422 18.07 3.633 17.7 3.633 17.7c-1.087-.744.084-.729.084-.729c1.205.084 1.838 1.236 1.838 1.236c1.07 1.835 2.809 1.305 3.495.998c.108-.776.417-1.305.76-1.605c-2.665-.3-5.466-1.332-5.466-5.93c0-1.31.465-2.38 1.235-3.22c-.135-.303-.54-1.523.105-3.176c0 0 1.005-.322 3.3 1.23c.96-.267 1.98-.399 3-.405c1.02.006 2.04.138 3 .405c2.28-1.552 3.285-1.23 3.285-1.23c.645 1.653.24 2.873.12 3.176c.765.84 1.23 1.91 1.23 3.22c0 4.61-2.805 5.625-5.475 5.92c.42.36.81 1.096.81 2.22c0 1.606-.015 2.896-.015 3.286c0 .315.21.69.825.57C20.565 22.092 24 17.592 24 12.297c0-6.627-5.373-12-12-12"/></svg> <span>NetCoreApps/BlazorGallery</span> </a> <a href="https://github.com/NetCoreApps/BlazorGalleryWasm" class="flex mt-2 text-xl text-gray-800"> <svg class="w-6 h-6 mr-2 align-text-bottom" xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24"><path fill="currentColor" d="M12 .297c-6.63 0-12 5.373-12 12c0 5.303 3.438 9.8 8.205 11.385c.6.113.82-.258.82-.577c0-.285-.01-1.04-.015-2.04c-3.338.724-4.042-1.61-4.042-1.61C4.422 18.07 3.633 17.7 3.633 17.7c-1.087-.744.084-.729.084-.729c1.205.084 1.838 1.236 1.838 1.236c1.07 1.835 2.809 1.305 3.495.998c.108-.776.417-1.305.76-1.605c-2.665-.3-5.466-1.332-5.466-5.93c0-1.31.465-2.38 1.235-3.22c-.135-.303-.54-1.523.105-3.176c0 0 1.005-.322 3.3 1.23c.96-.267 1.98-.399 3-.405c1.02.006 2.04.138 3 .405c2.28-1.552 3.285-1.23 3.285-1.23c.645 1.653.24 2.873.12 3.176c.765.84 1.23 1.91 1.23 3.22c0 4.61-2.805 5.625-5.475 5.92c.42.36.81 1.096.81 2.22c0 1.606-.015 2.896-.015 
3.286c0 .315.21.69.825.57C20.565 22.092 24 17.592 24 12.297c0-6.627-5.373-12-12-12"/></svg> <span>NetCoreApps/BlazorGalleryWasm</span> </a> <a href="https://docs.servicestack.net/vue/" class="flex mt-2 text-xl text-gray-800"> <svg class="w-6 h-6 mr-2 align-text-bottom" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" aria-hidden="true" role="img" class="w-28 h-28 sm:w-44 sm:h-44 iconify iconify--vscode-icons" width="1em" height="1em" viewBox="0 0 32 32"><path fill="#41b883" d="M24.4 3.925H30l-14 24.15L2 3.925h10.71l3.29 5.6l3.22-5.6Z"></path><path fill="#41b883" d="m2 3.925l14 24.15l14-24.15h-5.6L16 18.415L7.53 3.925Z"></path><path fill="#35495e" d="M7.53 3.925L16 18.485l8.4-14.56h-5.18L16 9.525l-3.29-5.6Z"></path></svg> <span>Vue Component Gallery</span> </a> </div> <div class="my-16 px-4 sm:px-6"> <div class="text-center"> <h1 class="text-4xl sm:text-5xl md:text-6xl tracking-tight font-extrabold text-gray-900"> <span class="block"> Creating Beautiful <span class="text-purple-600">Blazor Apps</span> </span> <span style="color:#44A8B3" class="block">with Tailwind</span> </h1> </div> <p class="mx-auto mt-5 max-w-prose text-xl text-gray-500"> Preview the highly productive development model of the new Blazor Tailwind template showing how easy it is to utilize beautifully designed components </p> <div class="my-8"> <lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="3gD_MMcYI-4" style="background-image: url('https://img.youtube.com/vi/3gD_MMcYI-4/maxresdefault.jpg')"></lite-youtube> </div> </div> <div class="relative bg-white py-4 mt-12"> <div class="mx-auto max-w-md px-4 text-center sm:max-w-3xl sm:px-6 lg:max-w-7xl lg:px-8"> <p class="mt-2 text-3xl font-extrabold tracking-tight text-gray-900 dark:text-gray-50 sm:text-4xl">Blazor Components</p> <p class="mx-auto mt-5 max-w-prose text-xl text-gray-500"> Rich, themable UI Component Library with declarative contextual Validation </p> </div> </div> To maximize productivity 
the template utilizes the **ServiceStack.Blazor** library containing integrated functionality for Blazor including an optimal JSON API HttpClient Factory, API-enabled base components and a rich library of Tailwind & Bootstrap UI Input components with integrated contextual validation support of ServiceStack's [structured Error responses](/error-handling) heavily utilized throughout each project template. ### Blazor Tailwind UI Components The built-in UI Components enable a clean & productive dev model, which as of this release include: | Component | Description | |-------------------|-----------------------------------------------------------------------------------| | `<TextInput>` | Text Input control for string properties | | `<DateTimeInput>` | Date Input control for Date properties | | `<CheckboxInput>` | Checkbox Input control for Boolean properties | | `<SelectInput>` | Select Dropdown for properties with finite list of values like Enums | | `<TextAreaInput>` | Text Input control for large strings | | `<DynamicInput>` | Dynamic component utilizing the appropriate above Input controls in Auto Forms | | `<AlertSuccess>` | Displaying successful notification feedback | | `<ErrorSummary>` | Displaying error summary message when no contextual field validation is available | | `<FileUpload>` | Used with `FilesUploadFeature` and `UploadTo` attribute to upload files | The Tailwind & Bootstrap components share the same functionally equivalent base classes that can be easily swapped when switching CSS frameworks by updating their namespace in your App's `_Imports.razor`.
```csharp @using ServiceStack.Blazor.Components.Tailwind //@using ServiceStack.Blazor.Components.Bootstrap ``` #### Themable Should it be needed, their decoupled design also allows easy customization by running the included [README.ss](https://github.com/LegacyTemplates/blazor-wasm/blob/main/MyApp.Client/Shared/Components/README.ss) executable documentation to copy each control's **Razor** UI markup locally into your project, enabling customization of all UI input controls. ### Bookings CRUD Example To demonstrate ServiceStack's clean & highly productive Blazor dev model, we'll walk through implementing the [AutoQuery Bookings CRUD](/autoquery/crud-bookings) example in Blazor. Since we're using [AutoQuery CRUD](/autoquery/crud) we only need to define the Request DTO with the input fields we want the user to populate in our `Booking` RDBMS table in [Bookings.cs](https://github.com/NetCoreTemplates/blazor/blob/main/MyApp.ServiceModel/Bookings.cs): ```csharp [Tag("bookings"), Description("Create a new Booking")] [Route("/bookings", "POST")] [ValidateHasRole("Employee")] [AutoApply(Behavior.AuditCreate)] public class CreateBooking : ICreateDb<Booking>, IReturn<IdResponse> { [Description("Name this Booking is for"), ValidateNotEmpty] public string Name { get; set; } public RoomType RoomType { get; set; } [ValidateGreaterThan(0)] public int RoomNumber { get; set; } [ValidateGreaterThan(0)] public decimal Cost { get; set; } public DateTime BookingStartDate { get; set; } public DateTime? BookingEndDate { get; set; } [Input(Type = "textarea")] public string? Notes { get; set; } } ``` Where we make use of [Declarative Validation](/declarative-validation) attributes to define the custom validation rules for this API.
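For example, if `CreateBooking` is submitted with an empty `Name`, the `ValidateNotEmpty` rule fails and the API returns its error in ServiceStack's standard `ResponseStatus` structure, along the lines of (a representative sketch of the JSON shape, not a captured response):

```json
{
  "responseStatus": {
    "errorCode": "NotEmpty",
    "message": "'Name' must not be empty.",
    "errors": [
      {
        "errorCode": "NotEmpty",
        "fieldName": "Name",
        "message": "'Name' must not be empty."
      }
    ]
  }
}
```

It's this per-field `errors` collection that the Blazor Input components bind to in order to display contextual validation messages next to each field.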
::: tip The `[Tag]`, `[Description]` and `[Input]` attributes are optional and mark up how this API appears in ServiceStack's built-in [API Explorer](/api-explorer.html#details-tab) and [Locode UIs](/locode/declarative) ::: ### Blazor App Thanks to ServiceStack's [Recommended Project Structure](/physical-project-structure) no additional classes are needed, as we're able to bind UI Controls directly to our typed server `CreateBooking` Request DTO used to define the API in [BookingsCrud/Create.razor](https://github.com/LegacyTemplates/blazor-wasm/blob/main/MyApp.Client/Pages/BookingsCrud/Create.razor): ```csharp <form @onsubmit="_ => OnSubmit()" @onsubmit:preventDefault> <CascadingValue Value=@api.Error> <div class=@CssUtils.ClassNames("shadow overflow-hidden sm:rounded-md bg-white", @class)> <div class="relative px-4 py-5 bg-white sm:p-6"> <CloseButton OnClose="close" /> <fieldset> <legend class="text-base font-medium text-gray-900 text-center mb-4">New Booking</legend> <ErrorSummary Except=@VisibleFields /> <div class="grid grid-cols-6 gap-6"> <div class="col-span-6 sm:col-span-3"> <TextInput @bind-Value="request.Name" required placeholder="Name for this booking" /> </div> <div class="col-span-6 sm:col-span-3"> <SelectInput @bind-Value="request.RoomType" Options=@(Enum.GetValues<RoomType>()) /> </div> <div class="col-span-6 sm:col-span-3"> <TextInput type="number" @bind-Value="request.RoomNumber" min="0" required /> </div> <div class="col-span-6 sm:col-span-3"> <TextInput type="number" @bind-Value="request.Cost" min="0" required /> </div> <div class="col-span-6 sm:col-span-3"> <DateTimeInput @bind-Value="request.BookingStartDate" required /> </div> <div class="col-span-6 sm:col-span-3"> <DateTimeInput @bind-Value="request.BookingEndDate" /> </div> <div class="col-span-6"> <TextAreaInput @bind-Value="request.Notes" placeholder="Notes about this booking" /> </div> </div> </fieldset> </div> </div> </CascadingValue> </form> @code { [Parameter] public
EventCallback<IdResponse> done { get; set; } [Parameter] public string? @class { get; set; } CreateBooking request = new() { BookingStartDate = DateTime.UtcNow, }; // Hide Error Summary Messages for Visible Fields which displays contextual validation errors string[] VisibleFields => new[] { nameof(request.Name), nameof(request.RoomType), nameof(request.RoomNumber), nameof(request.BookingStartDate), nameof(request.BookingEndDate), nameof(request.Cost), nameof(request.Notes), }; ApiResult<IdResponse> api = new(); async Task OnSubmit() { api = await ApiAsync(request); if (api.Succeeded) { await done.InvokeAsync(api.Response!); request = new(); } } async Task close() => await done.InvokeAsync(null); } ``` Calling ServiceStack APIs requires no additional code-gen or boilerplate, as the populated Request DTO can be sent as-is using the [JsonApiClient Api methods](/csharp-client#high-level-api-and-apiasync-methods), which return an encapsulated successful API or structured error response in its typed `ApiResult<T>`. The UI validation binding uses Blazor's `<CascadingValue>` to propagate any `api.Error` responses down to the child Input components.
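Conceptually, each child Input component receives that cascaded value and looks up its own field's error, along the lines of (an illustrative sketch of the pattern, not ServiceStack.Blazor's actual implementation — the `ContextualInputBase` class and `Id` convention here are assumptions):

```csharp
using System;
using System.Linq;
using Microsoft.AspNetCore.Components;
using ServiceStack;

public abstract class ContextualInputBase : ComponentBase
{
    // Populated by the parent's <CascadingValue Value=@api.Error>
    [CascadingParameter] public ResponseStatus? Status { get; set; }

    // The DTO property this input is bound to, e.g. "Name"
    [Parameter] public string? Id { get; set; }

    // The contextual error to render under the input, if any
    protected string? ErrorMessage => Status?.Errors?
        .FirstOrDefault(x => string.Equals(x.FieldName, Id, StringComparison.OrdinalIgnoreCase))?
        .Message;
}
```

When no matching field error exists, `ErrorMessage` is `null` and the input renders in its normal state, which is why unmatched errors fall through to the `<ErrorSummary>` component instead.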
That's all there is to it. We use [Tailwind's CSS Grid classes](https://tailwindcss.com/docs/grid-template-columns) to define our UI layout, which shows each control in its own row for mobile UIs or 2 fields per row in resolutions larger than [Tailwind's sm: responsive breakpoint](https://tailwindcss.com/docs/responsive-design) to render our beautiful Bookings Form: <div class="mx-auto max-w-screen-md text-center py-8"> <img src="/img/pages/blazor/bookings-create.png"> </div> The form utilizes both client and server validation upon submission, displaying UX-friendly contextual errors under each field when they violate any server [declarative validation](/declarative-validation) or Client UI **required** rules: <div class="mx-auto max-w-screen-md text-center py-8"> <img src="/img/pages/blazor/bookings-create-validation.png"> </div> ## Optimal Development Workflow Utilizing Blazor WebAssembly (WASM) with a ServiceStack backend yields an optimal frictionless [API First development model](/api-first-development) where UIs can bind directly to Typed DTOs whilst benefiting from ServiceStack's [structured error handling](/validation) & rich contextual form validation binding. By utilizing ServiceStack's [decoupled project structure](/physical-project-structure), combined with Blazor enabling C# on the client, we're able to get 100% reuse of your API's shared DTOs as-is to enable an end-to-end Typed API automatically, free from any additional tooling or code-gen complexity. <lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="BcQqCzm4tK0" style="background-image: url('https://img.youtube.com/vi/BcQqCzm4tK0/maxresdefault.jpg')"></lite-youtube> ## Api and ApiAsync methods .NET was originally conceived to use Exceptions for error control flow, however there's been a tendency in modern languages & libraries to shun Exceptions and return errors as normal values, an approach we believe is a more flexible & ergonomic way to handle API responses.
### The ApiResult way The `Api(Request)` and `ApiAsync(Request)` APIs return a typed `ApiResult<Response>` value result encapsulating either a typed Response or a structured API Error populated in `ResponseStatus`, allowing you to handle API responses programmatically without `try/catch` handling. The example below creates a new Booking: ```csharp CreateBooking request = new(); ApiResult<IdResponse> api = new(); async Task OnSubmit() { api = await Client.ApiAsync(request); if (api.Succeeded) { await done.InvokeAsync(api.Response!); request = new(); } } ``` Despite its terseness, this handles both **success** and **error** API responses: **if successful** it invokes the `done()` callback notifying its parent of the new Booking API Response before resetting the Form's data model with a new Request DTO. Upon **failure** the error response is populated in `api.Error` which binds to the UI via Blazor's `<CascadingValue Value=@api.Error>` to propagate it to all its child components in order to show contextual validation errors next to their respective Input controls. ## JSON API Client The recommended way to configure a Service Client to use in your Blazor WASM Apps is to use `AddBlazorApiClient()`, e.g: ```csharp builder.Services.AddBlazorApiClient(builder.Configuration["ApiBaseUrl"] ?? builder.HostEnvironment.BaseAddress); ``` This registers a typed HTTP Client factory returning a recommended pre-configured `JsonApiClient` to communicate with your back-end ServiceStack APIs, including support for CORS, required when hosting the decoupled UI on a different server (e.g. a CDN) from your server. If you're deploying your Blazor WASM UI to a CDN you'll need to specify the URL for the server, otherwise if it's deployed together with your Server App you can use the Host's Base Address.
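Once registered, the typed client can be injected and used from any component, e.g. (a minimal sketch, assuming the `Hello` Request DTO from your shared `ServiceModel` project):

```csharp
@inject JsonApiClient Client

@code {
    string? result;

    protected override async Task OnInitializedAsync()
    {
        // ApiAsync returns an ApiResult instead of throwing on error
        var api = await Client.ApiAsync(new Hello { Name = "Blazor" });
        if (api.Succeeded)
            result = api.Response!.Result;
    }
}
```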
### Public Pages & Components To reduce boilerplate, your Blazor Pages & components can inherit the template's local [AppComponentBase.cs](https://github.com/LegacyTemplates/blazor-wasm/blob/main/MyApp.Client/AppComponentBase.cs) which inherits `BlazorComponentBase`, which gets injected with the `IClientFactory` and provides convenient access to the most common APIs: ```csharp public class BlazorComponentBase : ComponentBase, IHasJsonApiClient { [Inject] public IClientFactory? ClientFactory { get; set; } public IServiceGateway Gateway => ClientFactory!.GetGateway(); public JsonApiClient Client => ClientFactory!.GetClient(); public virtual Task<ApiResult<TResponse>> ApiAsync<TResponse>(IReturn<TResponse> request) => UseGateway ? Gateway.ManagedApiAsync(request) : Client.ManagedApiAsync(request); public virtual Task<ApiResult<EmptyResponse>> ApiAsync(IReturnVoid request); /*...*/ public virtual Task<TResponse> SendAsync<TResponse>(IReturn<TResponse> request); public virtual Task<IHasErrorStatus> ApiAsync<Model>(object request); public virtual Task<ApiResult<Model>> ApiFormAsync<Model>(object requestDto, MultipartFormDataContent request); } ``` ### Protected Pages & Components Pages and Components requiring Authentication should inherit from [AppAuthComponentBase](https://github.com/LegacyTemplates/blazor-wasm/blob/main/MyApp.Client/AppComponentBase.cs) instead, which integrates with Blazor's Authentication Model to provide access to the currently authenticated user: ```csharp public abstract class AppAuthComponentBase : AppComponentBase { [CascadingParameter] protected Task<AuthenticationState>? AuthenticationStateTask { get; set; } protected bool HasInit { get; set; } protected bool IsAuthenticated => User?.Identity?.IsAuthenticated ?? false; protected ClaimsPrincipal?
User { get; set; } protected override async Task OnParametersSetAsync() { var state = await AuthenticationStateTask!; User = state.User; HasInit = true; } } ``` ## Benefits of Shared DTOs Typically with Web Apps, our client uses a different language than C#, so equivalent Request DTOs need to be generated for the client. ### TypeScript Example For example, TypeScript-generated DTOs still give us typed end-to-end services with the help of tooling like [Add ServiceStack Reference](/add-servicestack-reference): ```csharp [Route("/hello/{Name}")] public class Hello : IReturn<HelloResponse> { public string Name { get; set; } } public class HelloResponse { public string Result { get; set; } } ``` Turns into: ```typescript // @Route("/hello/{Name}") export class Hello implements IReturn<HelloResponse> { public name: string; public constructor(init?: Partial<Hello>) { (Object as any).assign(this, init); } public getTypeName() { return 'Hello'; } public getMethod() { return 'POST'; } public createResponse() { return new HelloResponse(); } } export class HelloResponse { public result: string; public responseStatus: ResponseStatus; public constructor(init?: Partial<HelloResponse>) { (Object as any).assign(this, init); } } ``` When Request or Response DTOs change during development, the client DTOs need to be regenerated using a command like [`x csharp`](./add-servicestack-reference.md#simple-command-line-utilities). ### Blazor Example When developing your Blazor UI, however, you just change your shared request/response DTO in the shared `ServiceModel` project, and both your client and server compile against the same request/response DTO classes. This eliminates the need for any additional step.
In the `ServiceModel` project, we still have: ```csharp [Route("/hello/{Name}")] public class Hello : IReturn<HelloResponse> { public string Name { get; set; } } public class HelloResponse { public string Result { get; set; } } ``` Which the Blazor C# App can use directly in its **.razor** pages: ```csharp @code { Hello request = new() { Name = "Blazor" }; ApiResult<HelloResponse> api = new(); protected override async Task OnInitializedAsync() => await submit(); async Task submit() => api = await ApiAsync(request); } ``` ### FileUpload Control The File Upload UI component used in the [File Blazor Demo](/locode/files-blazor) has been extracted into a reusable Blazor component you can utilize in your own apps: ![](/img/pages/templates/fileupload-blazor-usage-example.png) It's a simple control that takes advantage of ServiceStack's declarative [Managed File Uploads](/locode/files-overview) support to effortlessly enable multiple file uploads that can be declaratively added to any Request DTO, which only requires setting 2 properties: | Property | Description | |------------------|----------------------------------------------------------------------------------------------------| | Request | Request DTO object instance populated with info to be sent to your endpoint | | FilePropertyName | The name of the property that is used to reference your file, used with the `[UploadTo]` attribute | #### Example usage Below is an AutoQuery CRUD API example that references an upload location defined when configuring the [FileUploadFeature Plugin](/locode/files-upload-filesystem.md): ```csharp public class CreateMyDtoWithFileUpload : ICreateDb<MyDtoWithFileUpload>, IReturn<IdResponse> { [Input(Type="file"), UploadTo("fs")] public string FilePath { get; set; } public string OtherData { get; set; } } public class QueryFileUpload : QueryDb<MyDtoWithFileUpload> {} public class MyDtoWithFileUpload { [AutoIncrement] public int Id { get; set; } public string FilePath { get; set; } public
string OtherData { get; set; } } ``` When calling this API, the Managed File Uploads feature will upload the HTTP File Upload included in the API request to the configured **fs** upload location and populate the uploaded path to the `FilePath` Request DTO property. The Blazor `FileUpload` Control handles the [C# File Upload API Request](/locode/files.html#uploading-files-from-c) by providing the Request DTO instance to send and the DTO property the File Upload should populate: ```html @page "/file-upload" <h3>FileUploadPage</h3> <FileUpload Request="request" FilePropertyName="@nameof(CreateMyDtoWithFileUpload.FilePath)" /> @code { // Any additional values should be populated // on the request object before the upload starts. CreateMyDtoWithFileUpload request = new() { OtherData = "Test" }; } ``` ![](/img/pages/templates/fileupload-blazor-example.png) The `FilePropertyName` matches the property name that is annotated by the `UploadTo` attribute. The `Request` is the instance of the Request DTO. ### Existing Template Upgrade for 6.3 If you created a `blazor-tailwind` project using this template before the ServiceStack 6.4 release, you should run the following commands to upgrade your project to use components from `ServiceStack.Blazor` component library which should be run from your `.Client` project: ::: sh npx add-in -delete blazor-upgrade-clean ::: ::: sh npx add-in blazor-upgrade ::: --- # Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/autoquery/bookings-crud.md --- title: AutoQuery CRUD Bookings Demo --- The powerfully productive combination of [AutoQuery](/autoquery/rdbms) and [Locode](/locode/) can be used to give **Authorized Users an Instant UI** to access AutoQuery Services resulting in an immediate fully queryable (inc. export to Excel) & management UI over system tables within minutes. 
By virtue of being normal ServiceStack Services, AutoQuery APIs also inherit ServiceStack's ecosystem of features like [Add ServiceStack Reference](/add-servicestack-reference), enabling high-performance end-to-end typed API access in all popular Web, Mobile & Desktop platforms. ## Creating a multi-user .NET Core Booking system in minutes! To see the rapid development of AutoQuery in action we've created a quick demo showing how to create a simple multi-user Booking System from an empty [web](https://github.com/NetCoreTemplates/web) project, [mixed in](/mix-tool) with the preferred RDBMS & Auth layered functionality, before enabling [Validation](/validation), [AutoQuery](/autoquery/rdbms), Admin Users & [CRUD Event Log](/autoquery/audit-log) plugins to lay the foundational features. We then build our App by first defining its `Booking` data model & its surrounding **Query**, **Create**, **Update** and **Soft Delete** Typed CRUD APIs, with rich validation enforced by declarative Validation attributes and multi-layer authorization rules & access permissions protected using Authorization attributes. All declarative functionality is accessible in Locode, which is used to create new Employee & Manager Users, before signing in with each to hit the ground running and start entering new bookings using Locode's capability-based UI, with each change visible in its **Audit History**.
<lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="rSFiikDjGos" style="background-image: url('https://img.youtube.com/vi/rSFiikDjGos/maxresdefault.jpg')"></lite-youtube> ### Download and Run The quickest way to run the [Bookings AutoQuery Example](https://github.com/NetCoreApps/BookingsCrud) is to install the [x tool](/dotnet-tool), then download & run the repo: ```bash x download NetCoreApps/BookingsCrud cd BookingsCrud\Acme dotnet run ``` ### Custom project from Scratch If you have different App requirements you can instead create a project from scratch that integrates with your existing preferred infrastructure: the [mix tool](/mix-tool) and ServiceStack's layered [Modular Startup](/modular-startup) configurations make this a cinch. Start with an empty `web` project: :::sh npx create-net web ProjectName ::: Then mix in your desired features. E.g. in order for this project to be self-hosting it utilizes the embedded SQLite database, which we can configure along with popular Authentication providers and an RDBMS SQLite Auth Repository with: :::sh npx add-in auth auth-db sqlite ::: But if you also wanted to enable the new [Sign in with Apple](/auth/signin-with-apple) and use SQL Server you'll instead run: :::sh npx add-in auth-ext auth-db sqlserver ::: You can view all available DB and Auth options by searching for layered gist configurations by tag: ```bash npx add-in [db] npx add-in [auth] ``` Typically the only configuration that needs updating is your DB connection string in [Configure.Db.cs](https://github.com/NetCoreApps/BookingsCrud/blob/main/Acme/Configure.Db.cs), in this case it's changed to use a persistent SQLite DB: ```csharp services.AddSingleton<IDbConnectionFactory>(new OrmLiteConnectionFactory( Configuration.GetConnectionString("DefaultConnection") ??
"bookings.sqlite", SqliteDialect.Provider)); ``` You'll also want to create RDBMS tables for any that doesn't exist: ```csharp using var db = appHost.Resolve<IDbConnectionFactory>().Open(); db.CreateTableIfNotExists<Booking>(); ``` ### Create Booking CRUD Services The beauty of AutoQuery is that we only need to focus on the definition of our C# POCO Data Models which OrmLite uses to create the RDBMS tables and AutoQuery reuses to generates the Typed API implementations enabling us to build full functional high-performance systems with rich querying capabilities that we can further enhance with declarative validation & authorization permissions and rich integrations with the most popular platforms without needing to write any logic. The `Booking` class defines the Data Model whilst the remaining AutoQuery & CRUD Services define the typed inputs, outputs and behavior of each API available that Queries and Modifies the `Booking` table. An added utilized feature are the `[AutoApply]` attributes which applies generic behavior to AutoQuery Services. 
The `Behavior.Audit*` behaviors below depend on the same property names used in the [AuditBase.cs](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/AuditBase.cs) class where:

- `Behavior.AuditQuery` - adds an [Ensure AutoFilter](/autoquery/crud#autofilter) to filter out any deleted records
- `Behavior.AuditCreate` - populates the `Created*` and `Modified*` properties with the Authenticated user info
- `Behavior.AuditModify` - populates the `Modified*` properties with the Authenticated user info
- `Behavior.AuditSoftDelete` - changes the behavior of the default **Real Delete** to a **Soft Delete** by populating the `Deleted*` properties

```csharp
public class Booking : AuditBase
{
    [AutoIncrement]
    public int Id { get; set; }
    public string Name { get; set; }
    public RoomType RoomType { get; set; }
    public int RoomNumber { get; set; }
    public DateTime BookingStartDate { get; set; }
    public DateTime? BookingEndDate { get; set; }
    public decimal Cost { get; set; }
    public string Notes { get; set; }
    public bool? Cancelled { get; set; }
}

public enum RoomType
{
    Single,
    Double,
    Queen,
    Twin,
    Suite,
}

[AutoApply(Behavior.AuditQuery)]
public class QueryBookings : QueryDb<Booking>
{
    public int[] Ids { get; set; }
}

[ValidateHasRole("Employee")]
[AutoApply(Behavior.AuditCreate)]
public class CreateBooking : ICreateDb<Booking>, IReturn<IdResponse>
{
    public string Name { get; set; }
    [ApiAllowableValues(typeof(RoomType))]
    public RoomType RoomType { get; set; }
    [ValidateGreaterThan(0)]
    public int RoomNumber { get; set; }
    public DateTime BookingStartDate { get; set; }
    public DateTime? BookingEndDate { get; set; }
    [ValidateGreaterThan(0)]
    public decimal Cost { get; set; }
    public string Notes { get; set; }
}

[ValidateHasRole("Employee")]
[AutoApply(Behavior.AuditModify)]
public class UpdateBooking : IPatchDb<Booking>, IReturn<IdResponse>
{
    public int Id { get; set; }
    public string Name { get; set; }
    [ApiAllowableValues(typeof(RoomType))]
    public RoomType? RoomType { get; set; }
    [ValidateGreaterThan(0)]
    public int? RoomNumber { get; set; }
    public DateTime? BookingStartDate { get; set; }
    public DateTime? BookingEndDate { get; set; }
    [ValidateGreaterThan(0)]
    public decimal? Cost { get; set; }
    public bool? Cancelled { get; set; }
    public string Notes { get; set; }
}

[ValidateHasRole("Manager")]
[AutoApply(Behavior.AuditSoftDelete)]
public class DeleteBooking : IDeleteDb<Booking>, IReturnVoid
{
    public int Id { get; set; }
}
```

### Single Patch Partial Update API

Previously the Edit UI required the full update `IUpdateDb<T>` API, but it now supports falling back to a partial `IPatchDb<T>` API (if one exists) where it will instead **only update the modified fields** that have changed.

Ultimately this means for most cases you'll only need to provide a single `IPatchDb<T>` API to update your data model as it allows for the most flexible functionality of only updating any **non-null** values provided. This does mean that every property other than the primary key should either be a **nullable reference or Value Type** (i.e. using `Nullable`).

Partial Updates using `IPatchDb<T>` are also beneficial in [crud audit logs](/autoquery/audit-log) as they only capture the fields that have changed instead of full record `IUpdateDb<T>` updates.

`IPatchDb<T>` APIs can also be used to reset fields to `null` by specifying them in a `Reset` DTO string collection Property or **Request Param**, e.g. `?reset=Field1,Field2`.
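As a sketch of what that could look like against the Patch API above — note the `Reset` property and the client call here are illustrative assumptions added for this example, not part of the original DTOs:

```csharp
[ValidateHasRole("Employee")]
[AutoApply(Behavior.AuditModify)]
public class UpdateBooking : IPatchDb<Booking>, IReturn<IdResponse>
{
    public int Id { get; set; }
    // ... other nullable properties as defined above ...

    // Hypothetical: fields listed here are explicitly reset back to null,
    // equivalent to sending ?reset=Notes,Cancelled as a Request Param
    public List<string> Reset { get; set; }
}

// Client usage, assuming typed DTOs generated via Add ServiceStack Reference
var client = new JsonApiClient(baseUrl);
await client.PatchAsync(new UpdateBooking {
    Id = 1,
    Cost = 200,                            // only non-null fields are updated
    Reset = new() { "Notes", "Cancelled" } // these fields are set back to null
});
```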
## Manage in Locode

After defining your AutoQuery APIs, start your App, then you can use the built-in [Locode UI](/locode/) to manage Bookings at:

<h3 class="text-4xl text-center text-indigo-800 pb-3"><span class="text-gray-300">https://example.org</span>/locode/</h3>

[![](/img/pages/locode/bookings-locode.png)](/locode/)

---
# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/locode/branding.md
---
title: Appearance & Branding
---

The logo at the top left can be changed by configuring the `UiFeature` plugin from your AppHost using `ConfigurePlugin<UiFeature>`.

```csharp
ConfigurePlugin<UiFeature>(feature => feature.Info.BrandIcon = new ImageInfo {
    Uri = "/logo.svg",
    Cls = "w-8 h-8 mr-1"
});
```

`Uri` is the path of your own logo from the `wwwroot` folder and the `Cls` value is the CSS classes applied to the image.

<div class="not-prose">
<ul class="my-8 grid gap-4 sm:grid-cols-2">
  <li class="rounded-lg m-0 col-span-1 flex flex-col text-center items-center bg-white shadow divide-y divide-gray-200">
    <div class="flex-1 flex flex-col px-4 mt-4">
      <div class="mt-4 p-0 text-xl font-medium text-gray-500">Default</div>
      <div class="rounded-lg focus-within:ring-2 focus-within:ring-offset-2 focus-within:ring-offset-gray-100 focus-within:ring-indigo-500 overflow-hidden">
        <img src="/img/pages/locode/database-first-northwind-branding1.png" alt="">
      </div>
    </div>
  </li>
  <li class="rounded-lg m-0 col-span-1 flex flex-col text-center items-center bg-white shadow divide-y divide-gray-200">
    <div class="flex-1 flex flex-col px-4 mt-4">
      <div class="mt-4 p-0 text-xl font-medium text-gray-500">Custom branding</div>
      <div class="rounded-lg focus-within:ring-2 focus-within:ring-offset-2 focus-within:ring-offset-gray-100 focus-within:ring-indigo-500 overflow-hidden">
        <img src="/img/pages/locode/database-first-northwind-branding2.png" alt="">
      </div>
    </div>
  </li>
</ul>
</div>

### Custom Brand Component

For even greater flexibility you can
also replace the entire [Brand.mjs component](/locode/custom-overview#custom-app-example) by creating a local `Brand` component in [/wwwroot/js/components/Brand.mjs](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack.Blazor/tests/ServiceStack.Blazor.Bootstrap.Tests/Server/js/components/Brand.mjs) which the Blazor WASM template does with: ```js const Brand = { template:/*html*/` <div class="flex items-center flex-shrink-0 max-w-sidebar"> <a title="My App" v-href="{ $page:'' }" class="text-2xl whitespace-nowrap overflow-x-hidden flex items-center"> <svg xmlns="http://www.w3.org/2000/svg" class="w-8 h-8 ml-1 mr-2" viewBox="0 0 24 24"> <path d="M23.834 8.101a13.912 13.912 0 0 1-13.643 11.72a10.105 10.105 0 0 1-1.994-.12a6.111 6.111 0 0 1-5.082-5.761a5.934 5.934 0 0 1 11.867-.084c.025.983-.401 1.846-1.277 1.871c-.936 0-1.374-.668-1.374-1.567v-2.5a1.531 1.531 0 0 0-1.52-1.533H8.715a3.648 3.648 0 1 0 2.695 6.08l.073-.11l.074.121a2.58 2.58 0 0 0 2.2 1.048a2.909 2.909 0 0 0 2.695-3.04a7.912 7.912 0 0 0-.217-1.933a7.404 7.404 0 0 0-14.64 1.603a7.497 7.497 0 0 0 7.308 7.405s.549.05 1.167.035a15.803 15.803 0 0 0 8.475-2.528c.036-.025.072.025.048.061a12.44 12.44 0 0 1-9.69 3.963a8.744 8.744 0 0 1-8.9-8.972a9.049 9.049 0 0 1 3.635-7.247a8.863 8.863 0 0 1 5.229-1.726h2.813a7.915 7.915 0 0 0 5.839-2.578a.11.11 0 0 1 .059-.034a.112.112 0 0 1 .12.053a.113.113 0 0 1 .015.067a7.934 7.934 0 0 1-1.227 3.549a.107.107 0 0 0-.014.06a.11.11 0 0 0 .073.095a.109.109 0 0 0 .062.004a8.505 8.505 0 0 0 5.913-4.876a.155.155 0 0 1 .055-.053a.15.15 0 0 1 .147 0a.153.153 0 0 1 .054.053A10.779 10.779 0 0 1 23.834 8.1zM8.895 11.628a2.188 2.188 0 1 0 2.188 2.188v-2.042a.158.158 0 0 0-.15-.15z" fill="#5C2D91"/> </svg> My App </a> </div> ` } export default Brand ``` To render its [custom App Brand component](https://vue-spa.web-templates.io/ui): [![](/img/pages/locode/custom-brand.png)](https://vue-spa.web-templates.io/ui) ## Custom Table Icons Attributes added to your database model can 
change the visuals in your Locode application. For example, by adding the `[Icon]` attribute on top of `Booking`, specifying either an `Svg` or `Uri` path, we can change the icon for the table in the left menu and in table relationships.

```csharp
[Icon(Svg = "<svg xmlns=\"http://www.w3.org/2000/svg\" ...")]
public class Booking { ... }
```

Which will use this Icon whenever referring to `Booking` items:

![](/img/pages/locode/code-first-bookings-custom-1.png)

### Custom Icons for Database-first tables

On database model classes, the `Icon` attribute can be used with a `Uri` or `Svg` to style the table in the left menu and when lookup data is displayed. For example, if we use the `TypeFilter` to access the data model types, we can apply the `Icon` attribute dynamically to `Order`, which will impact the tables that reference `Order`.

```csharp
TypeFilter = (type, req) =>
{
    if (Icons.TryGetValue(type.Name, out var icon))
        type.AddAttribute(new IconAttribute { Svg = Svg.Create(icon) });
    ...
}

public static Dictionary<string, string> Icons { get; } = new()
{
    ["Order"] = "<path fill='currentColor' ...",
};
```

<div class="not-prose">
<ul class="my-8 grid gap-4 sm:grid-cols-2">
  <li class="rounded-lg m-0 col-span-1 flex flex-col text-center items-center bg-white shadow divide-y divide-gray-200">
    <div class="flex-1 flex flex-col px-4 mt-4">
      <div class="mt-4 p-0 text-xl font-medium text-gray-500">Default Icon</div>
      <div class="rounded-lg focus-within:ring-2 focus-within:ring-offset-2 focus-within:ring-offset-gray-100 focus-within:ring-indigo-500 overflow-hidden">
        <img src="/img/pages/locode/database-first-northwind-icons-default.png" alt="">
      </div>
    </div>
  </li>
  <li class="rounded-lg m-0 col-span-1 flex flex-col text-center items-center bg-white shadow divide-y divide-gray-200">
    <div class="flex-1 flex flex-col px-4 mt-4">
      <div class="mt-4 p-0 text-xl font-medium text-gray-500">Custom Icon</div>
      <div class="rounded-lg focus-within:ring-2 focus-within:ring-offset-2 focus-within:ring-offset-gray-100
focus-within:ring-indigo-500 overflow-hidden">
        <img src="/img/pages/locode/database-first-northwind-icons-custom.png" alt="">
      </div>
    </div>
  </li>
</ul>
</div>

## Grouping services with Tag

To group the Northwind services under the same `Tag` name for the left menu in Locode, we can use the `Tag` attribute.

```csharp
[Tag("Northwind")]
public class Category { ... }

[Tag("Northwind")]
public class Customer { ... }
```

Instead of `Tables` we can now see our `Northwind` tag in the Locode app UI.

<div class="not-prose">
<ul class="my-8 grid gap-4 sm:grid-cols-2">
  <li class="rounded-lg m-0 col-span-1 flex flex-col text-center items-center bg-white shadow divide-y divide-gray-200">
    <div class="flex-1 flex flex-col px-4 mt-4">
      <div class="mt-4 p-0 text-xl font-medium text-gray-500">Default "Tables"</div>
      <div class="rounded-lg focus-within:ring-2 focus-within:ring-offset-2 focus-within:ring-offset-gray-100 focus-within:ring-indigo-500 overflow-hidden">
        <img src="/img/pages/locode/database-first-northwind-branding2.png" alt="">
      </div>
    </div>
  </li>
  <li class="rounded-lg m-0 col-span-1 flex flex-col text-center items-center bg-white shadow divide-y divide-gray-200">
    <div class="flex-1 flex flex-col px-4 mt-4">
      <div class="mt-4 p-0 text-xl font-medium text-gray-500">Custom Tag</div>
      <div class="rounded-lg focus-within:ring-2 focus-within:ring-offset-2 focus-within:ring-offset-gray-100 focus-within:ring-indigo-500 overflow-hidden">
        <img src="/img/pages/locode/database-first-northwind-tags.png" alt="">
      </div>
    </div>
  </li>
</ul>
</div>

As more unique `Tag` names are added, additional drop down menus will be created to group your services together.
### Adding Tags to Database-first tables

We can add the `[Tag]` attribute to all our Database-First Request DTOs using [AutoQuery AutoGen's](/autoquery/autogen) `ServiceFilter`:

```cs
GenerateCrudServices = new GenerateCrudServices {
    DbFactory = dbFactory,
    AutoRegister = true,
    ServiceFilter = (op, req) => {
        // Annotate all Auto generated Request DTOs with [Tag("Northwind")] attribute
        op.Request.AddAttributeIfNotExists(new TagAttribute("Northwind"));
    },
}
```

---
# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/built-in-profiling.md
---
slug: built-in-profiling
title: Built-in Mini Profiler
---

## Try the built-in Admin Logging & Profiling UI

For a better integrated alternative to Mini Profiler check out the built-in Admin [Logging & Profiling UI](/admin-ui-profiling).

---

## MVC Mini Profiler

ServiceStack's [HTML5 JSON Report Format](/html5reportformat) also includes the [Mvc Mini Profiler](https://github.com/MiniProfiler/dotnet) - by [@jarrod_dixon](https://twitter.com/jarrod_dixon) and [@samsaffron](https://twitter.com/samsaffron). It's the same profiler used to profile and help speed up sites like [Stack Overflow](http://www.stackoverflow.com) and more recently the much faster [NuGet v2.0](http://nuget.org) website.
The MVC Mini Profiler plugin is only available for classic ASP.NET Framework Web Apps; for .NET Core Apps you can use the official [MiniProfiler.AspNetCore.Mvc](https://www.nuget.org/packages/MiniProfiler.AspNetCore.Mvc) NuGet package, please see this community post for details.

### Using the MVC Mini Profiler

The built-in Mini Profiler is available in ServiceStack's .NET Framework features package:

:::copy
`<PackageReference Include="ServiceStack.NetFramework" Version="10.*" />`
:::

Just like the [Normal Mvc Mini Profiler](https://github.com/MiniProfiler/dotnet) you can enable it by registering the `MiniProfilerFeature` Plugin:

```csharp
Plugins.Add(new MiniProfilerFeature());
```

Then start it in your Global.asax; here's how to enable it for local requests:

```csharp
protected void Application_BeginRequest(object src, EventArgs e)
{
    if (Request.IsLocal)
        Profiler.Start();
}

protected void Application_EndRequest(object src, EventArgs e)
{
    Profiler.Stop();
}
```

Now if you also have ServiceStack Razor views you can enable the profiler by putting this into your _Layout.cshtml page:

```csharp
@ServiceStack.MiniProfiler.Profiler.RenderIncludes().AsRaw()
```

That's it! Now every time you view a web service or a razor page in your browser (locally) you'll see a profiler view of your service broken down in different stages:

![Hello MiniProfiler](/img/pages/advanced/miniprofiler-hello.png)

By default you get to see how long it took ServiceStack to de-serialize your request, run any Request / Response Filters and more importantly how long it took to **Execute** your service.
## SQL Profiling

The profiler includes special support for SQL Profiling that can easily be enabled for OrmLite and Dapper by getting them to use a Profiled Connection using a ConnectionFilter:

```csharp
Plugins.Add(new MiniProfilerFeature()); // Register before using ProfiledDbConnection

this.Container.Register<IDbConnectionFactory>(c =>
    new OrmLiteConnectionFactory(
        "~/App_Data/db.sqlite".MapHostAbsolutePath(), SqliteDialect.Provider) {
        ConnectionFilter = x => new ProfiledDbConnection(x, Profiler.Current)
    });
```

Refer to the [Main MVC MiniProfiler home page](https://github.com/MiniProfiler/dotnet) for instructions on how to configure profiling for Linq2Sql and EntityFramework.

It's also trivial to add custom steps enabling even finer-grained profiling for your services. Here's a [simple web service DB example](https://github.com/ServiceStack/ServiceStack/blob/master/tests/ServiceStack.WebHost.IntegrationTests/Services/ProfilerService.cs) returning a list of Movies using both a simple DB query and a dreaded N+1 query.

```csharp
public class MiniProfiler
{
    public string Type { get; set; }
}

public class MiniProfilerService : Service
{
    public object Any(MiniProfiler request)
    {
        var profiler = Profiler.Current;

        using (profiler.Step("MiniProfiler Service"))
        {
            if (request.Type == "n1")
            {
                using (profiler.Step("N + 1 query"))
                {
                    var results = new List<Movie>();
                    foreach (var movie in Db.Select<Movie>())
                    {
                        results.Add(Db.QueryById<Movie>(movie.Id));
                    }
                    return results;
                }
            }

            using (profiler.Step("Simple Select all"))
            {
                return Db.Select<Movie>();
            }
        }
    }
}
```

## View executed SQL

Calling the above service normally provides the following Profiler output:

![Simple DB Example](/img/pages/advanced/miniprofiler-simpledb.png)

Whilst calling the service with the **n1** param yields the following warning:

![Simple N+1 DB Example](/img/pages/advanced/miniprofiler-simpledb-n1.png)

In both cases you see the actual SQL statements performed by clicking the **SQL** link.
The N+1 query shows the following:

![N+1 DB Example SQL Statements](/img/pages/advanced/miniprofiler-simpledb-n1-sql.png)

Notice the special attention the MVC MiniProfiler team put into identifying **Duplicate** queries - Thanks Guys!

# Community Resources

- [MiniProfiler for AJAX calls with ServiceStack.Net](http://tunurgitr.blogspot.com/2012/04/miniprofiler-for-ajax-calls-with.html) by Jeff Harris

---
# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/ormlite/bulk-inserts.md
---
title: Bulk Inserts
---

<div class="py-8 max-w-7xl mx-auto px-4 sm:px-6">
<lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="3gO_OEWIyPo" style="background-image: url('https://img.youtube.com/vi/3gO_OEWIyPo/maxresdefault.jpg')"></lite-youtube>
</div>

Bulk Insert implementations are available for each [supported RDBMS](/ormlite/installation), enabling the most efficient way of inserting large amounts of data from code, encapsulated behind OrmLite's `BulkInsert` API:

```csharp
db.BulkInsert(rows);
```

## Bulk Insert Implementations

Which uses the optimal implementation available for each RDBMS:

- **PostgreSQL** - Uses PostgreSQL's [COPY](https://www.postgresql.org/docs/current/sql-copy.html) command via Npgsql's [Binary Copy](https://www.npgsql.org/doc/copy.html) import
- **MySql** - Uses the [MySqlBulkLoader](https://dev.mysql.com/doc/connector-net/en/connector-net-programming-bulk-loader.html) feature where data is written to a temporary **CSV** file that's imported directly by `MySqlBulkLoader`
- **MySqlConnector** - Uses [MySqlConnector's MySqlBulkLoader](https://mysqlconnector.net/api/mysqlconnector/mysqlbulkloadertype/) implementation which makes use of its `SourceStream` feature to avoid writing to a temporary file
- **SQL Server** - Uses SQL Server's `SqlBulkCopy` feature which imports data written to an in-memory `DataTable`
- **SQLite** - SQLite doesn't have a specific import feature,
instead Bulk Inserts are performed using batches of [Multiple Rows Inserts](https://www.tutorialscampus.com/sql/insert-multiple-rows.htm) to reduce I/O calls down to a configurable batch size
- **Firebird** - Is also implemented using **Multiple Rows Inserts** within an [EXECUTE BLOCK](https://firebirdsql.org/refdocs/langrefupd20-execblock.html), configurable up to Firebird's maximum of **256** statements

## SQL Multiple Row Inserts

All RDBMS's also support SQL's Multiple Insert Rows feature, which is an efficient and compact alternative for inserting multiple rows within a single INSERT statement:

```sql
INSERT INTO Contact (Id, FirstName, LastName, Age) VALUES
(1, 'John', 'Doe', 27),
(2, 'Jane', 'Doe', 42);
```

Normally OrmLite APIs use parameterized statements, however for Bulk Inserts it uses inline rasterized values in order to construct and send large SQL INSERT statements that avoid RDBMS's max parameter limitations. If preferred, this can be configured to be used instead of the default RDBMS-specific implementation:

```csharp
db.BulkInsert(rows, new BulkInsertConfig {
    Mode = BulkInsertMode.Sql
});
```

## Batch Size

**Multiple Row Inserts** are sent in batches of **1000** (the maximum for SQL Server), Firebird uses a maximum of **256**, whilst other RDBMS's can be configured to use larger batch sizes:

```csharp
db.BulkInsert(rows, new BulkInsertConfig {
    BatchSize = 1000
});
```

## Bulk Insert Benchmarks

To test the performance of Bulk Inserts we ran a number of benchmarks across macOS, Linux and Windows in our [Bulk Insert Performance](https://servicestack.net/posts/bulk-insert-performance) blog post.

The relative performances of the Apple M2 macOS Benchmarks provide some indication of the performance benefits of Bulk Inserts you can expect, confirming that they offer much better performance when needing to insert a significant number of rows, where they're up to **138x** more efficient when inserting just **1,000 rows**.
:::{.table .table-striped .text-base}

Relative performance for Inserting **1,000** records:

| Database       | Bulk Inserts | Multiple Rows Inserts | Single Row Inserts |
|----------------|-------------:|----------------------:|-------------------:|
| PostgreSQL     |           1x |                 1.32x |             57.04x |
| MySqlConnector |           1x |                 1.04x |            137.78x |
| MySql          |           1x |                 1.16x |            131.47x |
| SqlServer      |           1x |                 6.61x |             74.19x |

Relative performance for Inserting **10,000** records:

| Database       | Bulk Inserts | Multiple Rows Inserts |
|----------------|-------------:|----------------------:|
| PostgreSQL     |           1x |                 3.37x |
| MySqlConnector |           1x |                 1.24x |
| MySql          |           1x |                 1.52x |
| SqlServer      |           1x |                 9.36x |

Relative performance for Inserting **100,000** records:

| Database       | Bulk Inserts | Multiple Rows Inserts |
|----------------|-------------:|----------------------:|
| PostgreSQL     |           1x |                 3.68x |
| MySqlConnector |           1x |                 2.04x |
| MySql          |           1x |                 2.31x |
| SqlServer      |           1x |                10.14x |

:::

It also shows that the batched Multiple Row Inserts Bulk Insert mode is another good option for inserting large numbers of rows, staying within a **3.4x** performance range of the optimal Bulk Insert implementations for all but SQL Server, which is an order of magnitude slower than using `SqlBulkCopy`.

---
# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/cache-aware-clients.md
---
slug: cache-aware-clients
title: Cache Aware Service Clients
---

To implement a complete end-to-end HTTP Caching story you can use the cache-aware `CachedServiceClient` to enhance all existing `HttpWebRequest` based Service Clients, which manages its own local cache as instructed by the Server's HTTP Caching directives, whilst the `CachedHttpClient` does the same for the HttpClient-based `JsonHttpClient`.
Both Cache-Aware clients implement the full [IServiceClient](https://github.com/ServiceStack/ServiceStack/blob/master/docs/pages/IServiceClient.md) interface so they should be an easy drop-in enhancement for existing applications:

```csharp
IServiceClient client = new JsonServiceClient(baseUrl).WithCache();

//equivalent to:
IServiceClient client = new CachedServiceClient(new JsonServiceClient(baseUrl));
```

Likewise for `JsonHttpClient`:

```csharp
IServiceClient client = new JsonHttpClient(baseUrl).WithCache();

//equivalent to:
IServiceClient client = new CachedHttpClient(new JsonHttpClient(baseUrl));
```

As seen above both are decorators over existing .NET Service Clients where they'll append the appropriate HTTP Request Headers and inspect the HTTP Responses of **GET** Requests that contain HTTP Caching directives. All other HTTP Methods are just delegated through to the underlying Service Client.

The Service Clients maintain cached responses in an internal dictionary which can also be injected and shared if your app uses multiple Service Clients. For example, they could use the fast binary [MsgPack client](/messagepack-format) for performance-sensitive queries or Services returning binary data and use a JSON client for everything else:

```csharp
var sharedCache = new ConcurrentDictionary<string, HttpCacheEntry>();
IServiceClient msgPackClient = new MsgPackServiceClient(baseUrl).WithCache(sharedCache);
IServiceClient jsonClient = new JsonHttpClient(baseUrl).WithCache(sharedCache);
```

## Improved Performance and Reliability

When caching is enabled on Services, the Cache-aware Service Clients can dramatically improve performance by eliminating server requests entirely as well as reducing bandwidth for re-validated requests. They also offer an additional layer of resiliency, as re-validated requests that result in Errors will transparently fall back to using pre-existing locally cached responses.
For bandwidth-constrained environments like Mobile Apps they can dramatically improve the User Experience, and as they're available in all supported PCL client platforms we recommend their use where HTTP Caching is enabled on the Server.

## Community Resources

- [Caching Anyone](http://www.mindkin.co.nz/blog/2016/1/5/caching-anyone) by [@JezzSantos](https://twitter.com/JezzSantos)

---
# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/cacheresponse-attribute.md
---
slug: cacheresponse-attribute
title: CacheResponse Attribute
---

The `[CacheResponse]` attribute is a normal [Request Filter Attribute](/filter-attributes) which can be added at the top-level of your Service class, in which case it will cache the response of **All** Service implementations for **60 seconds**, e.g:

```csharp
[CacheResponse(Duration = 60)]
public class CachedServices : Service
{
    public object Any(GetCustomer request) { ... }

    public object Any(GetCustomerOrders request) { ... }
}
```

It can also be applied individually on a single Service implementation:

```csharp
[CacheResponse(Duration = 60)]
public object Any(GetCustomer request)
{
    return Db.SingleById<Customer>(request.Id);
}
```

## Caching AutoQuery Services

Request Filter attributes can also be applied on Request DTO's, as we've seen with [AutoQuery DynamoDB's QueryRockstarAlbums](/autoquery/dynamodb#caching-autoquery-services) Request DTO:

```csharp
[CacheResponse(Duration = 60)]
public class QueryRockstarAlbums : QueryData<RockstarAlbum> { ... }
```

However adding Request Filter Attributes **on Request DTO's** goes against our recommendation of keeping your DTO's in a separate, dependency-free **ServiceModel.dll** as it would require a dependency on the non-PCL **ServiceStack.dll**, which would prohibit being able to reuse your existing DTO .dll in PCL libraries, limiting their potential re-use.
You can still take advantage of the `[CacheResponse]` attribute on AutoQuery Services by defining a custom implementation, at which point adding the `[CacheResponse]` attribute behaves as normal and applies caching to your Service implementations. E.g. you can enable caching for multiple AutoQuery Services with:

```csharp
[CacheResponse(Duration = 60)]
public class MyCachedAutoQueryServices : Service
{
    public IAutoQueryData AutoQuery { get; set; }

    public object Any(QueryRockstars query) =>
        AutoQuery.Execute(query, AutoQuery.CreateQuery(query, Request), Request);

    public object Any(QueryRockstarAlbums query) =>
        AutoQuery.Execute(query, AutoQuery.CreateQuery(query, Request), Request);
}
```

### Server Cached and [HTTP Caching](/http-caching) enabled responses

When only specifying a `Duration=60` ServiceStack only **caches the Server Response** so it behaves similar to using the existing `ToOptimizedResult()` API, e.g:

```csharp
public object Any(GetCustomer request)
{
    return Request.ToOptimizedResultUsingCache(Cache, Request.RawUrl,
        TimeSpan.FromSeconds(60),
        () => Db.SingleById<Customer>(request.Id));
}
```

To also enable [HTTP Caching](/http-caching) features you'll need to opt-in by specifying an additional HTTP Caching directive. E.g. including a `MaxAge` instructs ServiceStack to apply **HTTP Caching** logic and return the appropriate headers:

```csharp
[CacheResponse(Duration=60, MaxAge=30)]
public object Any(GetCustomer request) => Db.SingleById<Customer>(request.Id);
```

Where subsequent identical requests from a **cache-aware client** will return their locally cached version within the first **30 seconds**, between **30-60 seconds** the client will re-validate the request with the Server who will return a **304 NotModified** Response with an **Empty Body**, after **60 seconds** the cache expires and the next request will **re-execute the Service** and populate the cache with a new response.
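That flow can be illustrated from the client's perspective with a cache-aware Service Client (see [Cache Aware Service Clients](/cache-aware-clients)) — a sketch assuming the `GetCustomer` Request DTO implements `IReturn<Customer>`:

```csharp
// Assumes GetCustomer : IReturn<Customer>, served by the [CacheResponse] Service above
var client = new JsonServiceClient(baseUrl).WithCache();

var c1 = client.Get(new GetCustomer { Id = 1 }); // executes the Service, caches the response locally
var c2 = client.Get(new GetCustomer { Id = 1 }); // within 30s (MaxAge): served from the local cache
// between 30-60s: the client re-validates and gets a 304 NotModified, empty-body response
// after 60s (Duration): the Server cache expires and the Service is re-executed
```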
#### CacheResponse Properties

The Caching behavior of the `[CacheResponse]` attribute can be further customized using any of the additional properties below:

```csharp
int Duration              // Cache expiry in seconds
int MaxAge                // MaxAge in seconds
CacheControl CacheControl // Customize Cache-Control HTTP Headers
bool VaryByUser           // Vary cache per user
string[] VaryByRoles      // Vary cache for users in these roles
bool LocalCache           // Use In Memory HostContext.LocalCache or HostContext.Cache
```

Using any of the other HTTP Cache properties will also trigger the HTTP Caching features. When a `MaxAge` isn't specified, i.e:

```csharp
[CacheResponse(Duration = 10, VaryByUser = true)]
public object Any(GetUserActivity request) { ... }
```

ServiceStack falls back to use the `HttpCacheFeature.DefaultMaxAge`, which defaults to **10 minutes**. In addition, the `VaryByUser` flag will construct a unique cache key for each user and return an additional `Vary: Cookie` HTTP Response Header.

## Advanced CacheInfo Customization

One limitation of using a .NET Attribute to specify caching behavior is that we're limited to .NET constant primitives, prohibiting the use of custom lambdas to capture custom behavior. This is also the reason why we need to use `int` for `Duration` and `MaxAge` instead of a more appropriate `TimeSpan`.

But we can still intercept the way the `[CacheResponse]` attribute works behind-the-scenes and programmatically enhance it with custom logic. [CacheResponseAttribute](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack/CacheResponseAttribute.cs) is just a wrapper around initializing a populated [CacheInfo](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack/CacheInfo.cs) POCO that it drops into the `IRequest.Items` dictionary where it's visible to your Service and any remaining Filters in ServiceStack's [Request Pipeline](/order-of-operations).
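As a sketch combining several of the properties above — note `GetSalesReport` and the `Sale` table are hypothetical names used purely for illustration:

```csharp
// Hypothetical Service: cache per-role for 5 minutes with HTTP Caching enabled,
// kept in the local in-memory cache instead of the registered distributed cache
[CacheResponse(Duration = 5 * 60, MaxAge = 60, LocalCache = true,
    VaryByRoles = new[] { "Admin", "Manager" })]
public object Any(GetSalesReport request)
{
    return Db.Select<Sale>(x => x.Year == request.Year);
}
```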
Essentially it's just doing this:

```csharp
req.Items[Keywords.CacheInfo] = new CacheInfo { ... };
```

The actual validation logic for processing the `CacheInfo` is encapsulated within the `HttpCacheFeature` Response Filter, which gives our Service a chance to modify its behavior first. E.g. in order to generically handle all Service responses the `[CacheResponse]` attribute uses the `IRequest.RawUrl` (the URL minus the domain) for the base CacheKey. Whilst using a RawUrl is suitable for uniquely identifying most requests, if QueryString params were sent in a different case or in a different order it would generate a different URL and multiple caches for essentially the same request.

We can remedy this behavior by changing the base CacheKey used, which is just a matter of retrieving the populated `CacheInfo` and changing the `KeyBase` to use the predictable [Reverse Routing](/routing#reverse-routing) `ToGetUrl()` API instead, e.g:

```csharp
[CacheResponse(Duration = 60)]
public async Task<object> Get(MyRequest request)
{
    var cacheInfo = (CacheInfo)base.Request.GetItem(Keywords.CacheInfo);
    cacheInfo.KeyBase = request.ToGetUrl(); //custom cache key
    if (await Request.HandleValidCache(cacheInfo))
        return null;

    ...
    return response;
}
```

Or generically for all cached Services by using a [Global Request Filter](/request-and-response-filters):

```csharp
this.GlobalRequestFiltersAsync.Add(async (req, res, requestDto) => {
    var cacheInfo = req.GetItem(Keywords.CacheInfo) as CacheInfo;
    if (cacheInfo?.KeyBase != null)
    {
        cacheInfo.KeyBase = requestDto.ToGetUrl(); //custom cache key
        await req.HandleValidCache(cacheInfo);
    }
});
```

When using a Global Request Filter to customize caching behavior as above, your `[CacheResponse]` should have a priority `<0` in order for it to be executed before any Global Request Filters, e.g:

```csharp
[CacheResponse(Priority = -1, Duration = 10 * 60)]
public class MyCachedServices : Service 
{
    ...
}
```

`HandleValidCache()` is used to re-validate the client's request with the new Cache Key and if it's determined the Client has a valid cache, will short-circuit the Service and return a **304 NotModified** Response.

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/caching.md

---
slug: caching
title: Caching Providers
---

As caching is an essential technology in the development of high-performance web services, ServiceStack has a number of different caching options available that each share the same [common client interface (ICacheClient)](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Caching/ICacheClient.cs) for the following cache providers:

* [Memory Cache](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack/Caching/MemoryCacheClient.cs) - Useful for single host web services without needing any infrastructure dependencies.
* [Redis](https://github.com/ServiceStack/ServiceStack.Redis) - A fast key-value store with non-volatile persistent storage and support for rich comp-sci data structures.
* [OrmLiteCacheClient](https://www.nuget.org/packages/ServiceStack.Server) - Supports all [OrmLite's RDBMS providers](/ormlite/) for using an existing RDBMS as a distributed cache.
* [Memcached](https://nuget.org/packages/ServiceStack.Caching.Memcached) - The original, tried and tested distributed memory caching provider.
* [Aws DynamoDB](https://www.nuget.org/packages/ServiceStack.Aws/) - Uses Amazon's Dynamo DB backend hosted on Amazon Web Services
* [Azure Table Storage](/azure#virtual-filesystem-backed-by-azure-blob-storage) - Uses Azure Table Storage for when your application is hosted on Azure.
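Because all providers share the same `ICacheClient` contract, code written against it is provider-agnostic. A minimal sketch of its core APIs, shown here against the in-memory provider so it needs no infrastructure (the keys and values are arbitrary examples):

```csharp
// Minimal sketch of the shared ICacheClient contract using MemoryCacheClient;
// the same calls work unchanged against the Redis, OrmLite, Memcached,
// DynamoDB and Azure Table Storage providers above
ICacheClient cache = new MemoryCacheClient();

cache.Set("greeting", "Hello");                         // store a value
cache.Set("token", "abc123", TimeSpan.FromMinutes(10)); // store with an expiry

var token = cache.Get<string>("token");                 // typed retrieval
cache.Remove("greeting");                               // invalidate a single key
```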
### Async Cache Clients

All remote Caching Providers also implement the [ICacheClientAsync](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Caching/ICacheClientAsync.cs) async APIs, whilst any other `ICacheClient`-only providers like the local in-memory `MemoryCacheClient` are still able to use the `ICacheClientAsync` interface as they'll return an [Async Wrapper](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack/Caching/CacheClientAsyncWrapper.cs) over the underlying sync APIs.

So even if you're currently only using `MemoryCacheClient` or your own `ICacheClient` sync implementation, you can still use the async Caching Provider API now and easily switch to an async caching provider in future without code changes.

The Async Caching Provider APIs are accessible via the `CacheAsync` property in ServiceStack `Service` or `ServiceStackController` classes, e.g:

```csharp
public async Task<object> Any(MyRequest request)
{
    var item = await CacheAsync.GetAsync<Item>("key");
    //....
}

public class HomeController : ServiceStackController
{
    public async Task<ActionResult> Index()
    {
        var item = await CacheAsync.GetAsync<Item>("key");
    }
}
```

Whilst outside of ServiceStack you can use `AppHost.GetCacheClientAsync()`, e.g:

```csharp
var cache = HostContext.AppHost.GetCacheClientAsync();
var item = await cache.GetAsync<Item>("key");
```

### Configure Caching Providers

To configure which cache should be used, the particular client has to be registered in the IoC container against the `ICacheClient` interface:

### Memory cache:

ServiceStack registers a `MemoryCacheClient` by default when no `ICacheClient` is registered, so no registration is necessary.
```csharp
//services.AddSingleton<ICacheClient>(new MemoryCacheClient());
```

Even if you have an alternative `ICacheClient` registered you can still access the in-memory cache via the `LocalCache` property in your Services and ServiceStack MVC Controllers, or anywhere else via the `HostContext.AppHost.GetMemoryCacheClient()` singleton, as well as with `[CacheResponse(LocalCache=true)]` when using the [Cache Response Attribute](/cacheresponse-attribute).

### Redis

```csharp
services.AddSingleton<IRedisClientsManager>(c => new RedisManagerPool("localhost:6379"));
services.AddSingleton(c => c.GetRequiredService<IRedisClientsManager>().GetCacheClient());
```

##### NuGet Package: [ServiceStack.Redis](http://www.nuget.org/packages/ServiceStack.Redis)

### OrmLite

```csharp
//Register OrmLite Db Factory if not already
services.AddSingleton<IDbConnectionFactory>(c => 
    new OrmLiteConnectionFactory(connString, SqlServerDialect.Provider));

services.AddSingleton<ICacheClient, OrmLiteCacheClient>();

//Create 'CacheEntry' RDBMS table if it doesn't exist already
appHost.Resolve<ICacheClient>().InitSchema();
```

#### SQL Server Memory Optimized Cache

SQL Server's Memory Optimized support can be used to improve the performance of `OrmLiteCacheClient` by configuring it to use the Memory Optimized Table Schema instead, e.g:

```csharp
services.AddSingleton<ICacheClient>(c => 
    new OrmLiteCacheClient<SqlServerMemoryOptimizedCacheEntry>());
```

##### NuGet Package: [ServiceStack.Server](http://www.nuget.org/packages/ServiceStack.Server)

### Memcached

```csharp
services.AddSingleton<ICacheClient>(
    new MemcachedClientCache(new[] { "127.0.0.1" })); //Add Memcached hosts
```

##### NuGet Package: [ServiceStack.Caching.Memcached](http://www.nuget.org/packages/ServiceStack.Caching.Memcached)

### AWS DynamoDB

```csharp
var awsDb = new AmazonDynamoDBClient(
    AWS_ACCESS_KEY, AWS_SECRET_KEY, RegionEndpoint.USEast1);

services.AddSingleton<IPocoDynamo>(new PocoDynamo(awsDb));
services.AddSingleton<ICacheClient>(c => 
    new DynamoDbCacheClient(c.GetRequiredService<IPocoDynamo>()));

var cache = appHost.Resolve<ICacheClient>();
cache.InitSchema();
```

##### NuGet Package: [ServiceStack.Aws](http://www.nuget.org/packages/ServiceStack.Aws)

### Azure:

```csharp
services.AddSingleton<ICacheClient>(new AzureTableCacheClient(cacheConnStr));
```

##### NuGet Package: [ServiceStack.Azure](http://www.nuget.org/packages/ServiceStack.Azure)

### Multi CacheClient

The `MultiCacheClient` can be used as a write-through multi-tiered cache client where all "writes" are made to all registered cache providers whilst "reads" are served from the first cache provider that contains the value. E.g. you can register a local memory and redis server backed Cache Client with:

```csharp
services.AddSingleton<ICacheClient>(c => new MultiCacheClient(
    new MemoryCacheClient(),
    c.GetRequiredService<IRedisClientsManager>().GetCacheClient()));
```

## Cache a response of a service

To cache a response you simply have to call `ToOptimizedResultUsingCache`, an extension method in `ServiceStack.ServiceHost`. In your service:

```csharp
public class OrdersService : Service
{
    public object Get(CachedOrders request)
    {
        var cacheKey = "unique_key_for_this_request";
        return base.Request.ToOptimizedResultUsingCache(base.Cache, cacheKey, () => 
        {
            //Delegate is executed if item doesn't exist in cache
            //Any response DTO returned here will be cached automatically
        });
    }
}
```

::: info Tip
There exists a class named [UrnId](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Common/UrnId.cs) which provides helper methods to create unique keys for an object
:::

`ToOptimizedResultUsingCache` also has an overload which provides a parameter to set the timespan after which the cache entry should be deleted (marked as expired). If a client now calls the same service method a second time and the cache has expired, the provided delegate, which returns the response DTO, will be executed a second time.
```csharp
var cacheKey = "some_unique_key";

//Cache should be deleted in 1h
var expireInTimeSpan = new TimeSpan(1, 0, 0);

return base.Request.ToOptimizedResultUsingCache(
    base.Cache, cacheKey, expireInTimeSpan, ...)
```

## Delete cached responses

If, for example, an order gets updated and the order was cached before the update, the webservice will still return the same result, because the cache doesn't know that the order has been updated. So there are two options:

* Use **time based** caching (and expire cache earlier)
* Cache on **validity**

::: info
When the cache is based on **validity** the caches are invalidated manually (e.g. when a user modifies their profile, clear their cache) which means you always get the latest version and you never need to hit the database again to rehydrate the cache if it hasn't changed, which will save resources
:::

So if the order gets updated, you should delete the cache manually:

```csharp
public class CachedOrdersService : Service
{
    public async Task Put(CachedOrders request)
    {
        //The order gets updated...

        var cacheKey = "some_unique_key_for_order";
        await CacheAsync.ClearCachesAsync(cacheKey);
    }
}
```

If the client now calls the web service to request the order, they'll get the latest version.
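The `UrnId` helper mentioned in the tip above is a natural fit for these per-entity validity-based cache keys, since the same key can be rebuilt for both caching and invalidation. A hedged sketch, assuming a hypothetical `Order` DTO/service and that `UrnId.Create<T>(id)` formats keys like `urn:order:1`:

```csharp
// Hypothetical Order DTO and service used for illustration only
public class CachedOrderService : Service
{
    public object Get(CachedOrder request)
    {
        //Predictable per-entity key, e.g. "urn:order:1"
        var cacheKey = UrnId.Create<Order>(request.Id.ToString());
        return Request.ToOptimizedResultUsingCache(Cache, cacheKey, () =>
            Db.SingleById<Order>(request.Id)); //rehydrate from DB on cache miss
    }

    public async Task Put(CachedOrder request)
    {
        //...update the order, then invalidate using the same key
        await CacheAsync.ClearCachesAsync(UrnId.Create<Order>(request.Id.ToString()));
    }
}
```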
### LocalCache

As it is sometimes beneficial to have access to a local in-memory Cache in addition to your registered `ICacheClient` [Caching Provider](/caching) we also pre-register a `MemoryCacheClient` that all your Services now have access to from the `LocalCache` property, i.e:

```csharp
MemoryCacheClient LocalCache { get; }
```

This doesn't affect any existing functionality that utilizes a cache like Sessions which continue to use your registered `ICacheClient`, but it does let you change which cache you want different responses to use, e.g:

```csharp
var cacheKey = "unique_key_for_this_request";
return base.Request.ToOptimizedResultUsingCache(LocalCache, cacheKey, () => 
{
    //Delegate is executed if item doesn't exist in cache
});
```

Or if you're using the [CacheResponse](/cacheresponse-attribute) attribute you can specify to cache responses in the local cache with:

```csharp
[CacheResponse(LocalCache = true)]
public object Any(MyRequest request) { ... }
```

::: info
If you don't register an `ICacheClient` ServiceStack automatically registers a `MemoryCacheClient` for you which will also refer to the same instance registered for `LocalCache`
:::

## [ICacheClientExtended](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Caching/ICacheClientExtended.cs)

The [ICacheClientExtended](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Caching/ICacheClientExtended.cs) API is used to provide additional non-core functionality to our most popular [Caching providers](/caching):

* Redis
* OrmLite RDBMS
* In Memory
* AWS
* Azure

The new APIs are added as extension methods on `ICacheClient` so they're easily accessible without casting. The new APIs available include:

* GetKeysByPattern(pattern) - return keys matching a wildcard pattern
* GetAllKeys() - return all keys in the caching provider
* GetKeysStartingWith() - Streaming API to return all keys starting with a prefix

With these new APIs you can now easily get all active User Sessions using any of the supported Caching providers above with:

```csharp
var sessionPattern = IdUtils.CreateUrn<IAuthSession>(""); //= urn:iauthsession:
var sessionKeys = Cache.GetKeysStartingWith(sessionPattern).ToList();

var allSessions = Cache.GetAll<IAuthSession>(sessionKeys);
```

### CacheClient with Prefix

The `CacheClientWithPrefix` class lets you decorate any `ICacheClient` to prefix all cache keys using the `.WithPrefix()` extension method. This could be used to easily enable multi-tenant usage of a single redis instance, e.g:

```csharp
services.AddSingleton(c => 
    c.GetRequiredService<IRedisClientsManager>().GetCacheClient().WithPrefix("site1"));
```

## Live Example and code

A live demo of the ICacheClient is available in [The ServiceStack.Northwind's example project](https://northwind.netcore.io/). Here are some requests to cached services:

* [/customers](https://northwind.netcore.io/cached/customers)
* [/customers/ALFKI](https://northwind.netcore.io/cached/customers/ALFKI)
* [/customers/ALFKI/orders](https://northwind.netcore.io/cached/customers/ALFKI/orders)

Which are simply existing web services wrapped using **ICacheClient** that are contained in [CachedServices.cs](https://github.com/ServiceStack/ServiceStack.Examples/blob/master/src/ServiceStack.Northwind/ServiceStack.Northwind.ServiceInterface/CachedServices.cs)

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/cancellable-requests.md

---
slug: cancellable-requests
title: Cancellable Requests
---

The Cancellable Requests Feature makes it easy to design long-running Services that are cancellable with an external Web Service Request.
To enable this feature, register the `CancellableRequestsFeature` plugin:

```csharp
Plugins.Add(new CancellableRequestsFeature());
```

## Designing a Cancellable Service

Then in your Service you can wrap your implementation within a disposable `ICancellableRequest` block which encapsulates a Cancellation Token that you can watch to determine if the Request has been cancelled, e.g:

```csharp
public object Any(TestCancelRequest req)
{
    using (var cancellableRequest = base.Request.CreateCancellableRequest())
    {
        //Simulate long-running request
        while (true)
        {
            cancellableRequest.Token.ThrowIfCancellationRequested();
            Thread.Sleep(100);
        }
    }
}
```

## Cancelling a remote Service

To be able to cancel a Server request on the client, the client must first **Tag** the request which it does by assigning the `X-Tag` HTTP Header with a user-defined string in a Request Filter before calling a cancellable Service, e.g:

```csharp
var tag = Guid.NewGuid().ToString();
var client = new JsonServiceClient(baseUri) {
    RequestFilter = req => req.Headers[HttpHeaders.XTag] = tag
};

var responseTask = client.PostAsync(new TestCancelRequest());
```

Then at any time whilst the Service is still executing, the remote request can be cancelled by calling the `CancelRequest` Service with the specified **Tag**, e.g:

```csharp
var cancelResponse = client.Post(new CancelRequest { Tag = tag });
```

If it was successfully cancelled it will return a `CancelRequestResponse` DTO with the elapsed time of how long the Service ran for. Otherwise if the remote Service had completed or never existed it will throw **404 Not Found** in a `WebServiceException`.

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/ai-server/chat.md

---
title: AI Server API Usage
---

AI Server provides a unified API to process requests for AI services to access LLMs, Image Generation, Transcription, and more.
The API is designed to be simple to use and easy to integrate into your applications, with support for many languages and frameworks.

## Chat UI

AI Server's Chat UI lets you send Open AI Chat requests with custom system prompts to any of its active LLMs:

<div class="not-prose">
<h3 class="text-4xl text-center text-indigo-800 pb-3"><span class="text-gray-300">https://localhost:5006</span>/Chat </h3>
</div>

![](/img/pages/ai-server/uis/Chat.webp)

## Making a Chat Request

To make a chat request to AI Server, you can use the `/api/OpenAiChatCompletion` endpoint. This endpoint requires an `OpenAiChatCompletion` request DTO that contains properties matching the `OpenAI` API.

### Sync Open AI Chat Completion

::include ai-server/cs/ai-server-compatible-1.cs.md::

This request will generate a response from the `llama3:8b` model using the `system` and `user` messages provided. This will perform the operation synchronously, waiting for the response to be generated before returning it to the client.

Alternatively, you can call the same endpoint asynchronously by using the `/api/QueueOpenAiChatCompletion` endpoint. This will queue the request for processing and return a URL to check the status of the request and download the response when it's ready.

### Queued Open AI Chat Completion

::include ai-server/cs/queue-openai-chat-completion-1.cs.md::

Additional optional features on the request to enhance the usage of AI Server include:

- **RefId**: A unique identifier for the request specified by the client to more easily track the progress of the request.
- **Tag**: A tag to help categorize the request for easier tracking.

`RefId` and `Tag` are available on both synchronous and asynchronous requests, whereas Queue requests also support:

- **ReplyTo**: A URL to send a POST request to when the request is complete.
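For example, a queued request can be tagged so it's easier to find and correlate later. A sketch assuming the same `client` and chat completion properties as the examples above (`"chat-ui"` is an arbitrary label, not a reserved value):

```csharp
var response = client.Post(new QueueOpenAiChatCompletion {
    //... same chat completion properties as above
    RefId = Guid.NewGuid().ToString("N"), //client-specified id to track this request
    Tag = "chat-ui",                      //arbitrary label to group related requests
});
```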
## Open AI Chat with ReplyTo Callback

The Queued API also accepts a **ReplyTo Web Callback** for a more reliable push-based App integration where responses are posted back to a custom URL Endpoint:

```csharp
var correlationId = Guid.NewGuid().ToString("N");
var response = client.Post(new QueueOpenAiChatCompletion {
    //...
    ReplyTo = $"https://example.org/api/OpenAiChatResponseCallback?CorrelationId={correlationId}"
});
```

Your callback URL can include additional metadata to assist your App in correlating the response with the initiating request. The callback DTO just needs to contain the properties of the `OpenAiChatResponse` you're interested in along with any metadata added to the callback URL, e.g:

```csharp
public class OpenAiChatResponseCallback : OpenAiChatResponse, IPost, IReturnVoid
{
    public Guid CorrelationId { get; set; }
}

public object Post(OpenAiChatResponseCallback request)
{
    // Handle OpenAiChatResponse callback
}
```

Unless your callback API is restricted to only accept requests from your AI Server, you should include a unique Id like a `Guid` in the callback URL that can be validated against an initiating request to ensure the callback can't be spoofed.

## Using the AI Server Request DTOs with other OpenAI compatible APIs

One advantage of using AI Server is that it provides a common set of request DTOs in 11 different languages that are compatible with OpenAI's API. This allows you to switch between OpenAI and AI Server without changing your client code.

This means you can switch to using typed APIs in your preferred language with your existing service provider's OpenAI compatible APIs, and optionally switch to AI Server when you're ready to self-host your AI services for better value.
::include ai-server/cs/open-ai-requests-1.cs.md::

This shows usage of the `OpenAiChat` request DTO directly with OpenAI's API using the ServiceStack `JsonApiClient`, so you get the benefits of typed APIs in your preferred language with your existing service provider's OpenAI compatible APIs.

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/auth/client-auth.md

---
title: Service Clients Authentication
---

## Authenticating with JavaScript or TypeScript Service Clients

Typically when using ServiceStack's **@servicestack/client** `JsonServiceClient` it will utilize the browser's authenticated cookies where you'll be able to make authenticated requests as the currently Authenticated User in your Application:

```ts
import { JsonServiceClient } from "@servicestack/client";

const client = new JsonServiceClient()

// Uses browser's authenticated cookies by default
const api = await client.api(new Secured())
```

Alternatively you can also authenticate with JavaScript by sending an `Authenticate` Request, e.g:

```ts
import { JsonServiceClient } from "@servicestack/client";

const client = new JsonServiceClient()
const apiAuth = await client.api(new Authenticate({
    provider: 'credentials', userName, password }))
if (apiAuth.succeeded) {
    const api = await client.api(new Secured())
}
```

As the cookies are shared with the browser, this will also authenticate the browser session where you'll be able to view protected Blazor, MVC or Razor Pages after successful authentication.

## Authenticating with C#/.NET Service Clients

On the client you can use the [C#/.NET Service Clients](/csharp-client) to easily consume your authenticated Services.
You can authenticate against your registered **Credentials** Auth Provider by submitting a populated `Authenticate` Request DTO, e.g:

```csharp
var client = new JsonServiceClient(BaseUrl);

var apiAuth = await client.ApiAsync(new Authenticate {
    provider = "credentials",
    UserName = userName,
    Password = password,
});
if (apiAuth.Succeeded)
{
    //...
}
```

If authentication was successful the Service Client `client` instance will be populated with authenticated session cookies which then allows calling Authenticated services, e.g:

```csharp
var api = await client.ApiAsync(new GetActiveUserId());
```

If you've also registered the `BasicAuthProvider` it will enable your Services to accept [HTTP Basic Authentication](https://en.wikipedia.org/wiki/Basic_access_authentication), which is built into the Service Clients and can be populated on the client with:

```csharp
client.UserName = userName;
client.Password = password;
```

Which will also let you access protected Services, e.g:

```csharp
var response = client.Get(new GetActiveUserId());
```

Although behind-the-scenes it ends up making 2 requests: the 1st request is sent as normal and gets rejected with a `401 Unauthorized`, and if the Server indicates it has the `BasicAuthProvider` enabled the request is resent with the HTTP Basic Auth credentials.
You could instead save the latency of the additional auth challenge request by specifying the client should always send the Basic Auth with every request:

```csharp
client.AlwaysSendBasicAuthHeader = true;
```

## Authenticating with HTTP

To Authenticate against your registered **Credentials** Auth Provider you can **POST** a raw JSON body:

**POST** localhost:60339/auth/credentials?format=json

```json
{
    "UserName": "admin",
    "Password": "p@55wOrd",
    "RememberMe": true
}
```

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/redis/client-managers.md

---
slug: client-managers
title: Managing connections
---

## Redis Connection Strings

Redis Connection strings have been expanded to support the more versatile URI format which is now able to capture most of Redis Client settings in a single connection string (akin to DB Connection strings).

Redis Connection Strings supports multiple URI-like formats, from a simple **hostname** or **IP Address and port** pair to a fully-qualified **URI** with multiple options specified on the QueryString. Some examples of supported formats:

```
localhost
127.0.0.1:6379
redis://localhost:6379
password@localhost:6379
clientid:password@localhost:6379
redis://clientid:password@localhost:6380?ssl=true&db=1
```

::: info
More examples can be seen in [ConfigTests.cs](https://github.com/ServiceStack/ServiceStack.Redis/blob/master/tests/ServiceStack.Redis.Tests/ConfigTests.cs)
:::

Any additional configuration can be specified as QueryString parameters.
The full list of options that can be specified include:

<table>
    <tr>
        <td><b>Ssl</b></td>
        <td>bool</td>
        <td>If this is an SSL connection</td>
    </tr>
    <tr>
        <td><b>Db</b></td>
        <td>int</td>
        <td>The Redis DB this connection should be set to</td>
    </tr>
    <tr>
        <td><b>Client</b></td>
        <td>string</td>
        <td>A text alias to specify for this connection for analytic purposes</td>
    </tr>
    <tr>
        <td><b>Username</b></td>
        <td>string</td>
        <td>Redis Username when using ACLs</td>
    </tr>
    <tr>
        <td><b>Password</b></td>
        <td>string</td>
        <td>UrlEncoded version of the Password for this connection</td>
    </tr>
    <tr>
        <td><b>ConnectTimeout</b></td>
        <td>int</td>
        <td>Timeout in ms for making a TCP Socket connection</td>
    </tr>
    <tr>
        <td><b>SendTimeout</b></td>
        <td>int</td>
        <td>Timeout in ms for making a synchronous TCP Socket Send</td>
    </tr>
    <tr>
        <td><b>ReceiveTimeout</b></td>
        <td>int</td>
        <td>Timeout in ms for waiting for a synchronous TCP Socket Receive</td>
    </tr>
    <tr>
        <td><b>IdleTimeOutSecs</b></td>
        <td>int</td>
        <td>Timeout in Seconds for an Idle connection to be considered active</td>
    </tr>
    <tr>
        <td><b>NamespacePrefix</b></td>
        <td>string</td>
        <td>Use a custom prefix for ServiceStack.Redis internal index collections</td>
    </tr>
</table>

When using [Redis ACLs](https://redis.io/docs/manual/security/acl/) the Username needs to be specified on the QueryString, e.g:

```csharp
var connString = $"redis://{Host}?ssl=true&username={Username}&password={Password.UrlEncode()}";
var redisManager = new RedisManagerPool(connString);
```

### [Connecting to Azure Redis](/ssl-redis-azure)

As connecting to [Azure Redis Cache](http://azure.microsoft.com/en-us/services/cache/) via SSL was the primary use-case for this feature, we've added a new [Getting connected to Azure Redis via SSL](/ssl-redis-azure) guide to help you get started.

## Redis Client Managers

The recommended way to access `RedisClient` instances is to use one of the available Thread-Safe Client Managers below.
Client Managers are connection factories which should be registered as a Singleton either in your IOC or static class.

### RedisManagerPool

With the enhanced Redis URI Connection Strings we've been able to simplify and streamline the existing `PooledRedisClientManager` implementation and have extracted it out into a new clients manager called `RedisManagerPool`.

In addition to removing all above options on the Client Manager itself, readonly connection strings have also been removed so the configuration ends up much simpler and more aligned with the common use-case:

```csharp
container.Register<IRedisClientsManager>(c => 
    new RedisManagerPool(redisConnectionString));
```

**Pooling Behavior**

Any connections required after the maximum Pool size has been reached will be created and disposed outside of the Pool. By not being restricted to a maximum pool size, the pooling behavior in `RedisManagerPool` can maintain a smaller connection pool size at the cost of potentially having a higher opened/closed connection count.

### PooledRedisClientManager

If you prefer to define options on the Client Manager itself or you want to provide separate Read/Write and ReadOnly (i.e. Master and Replica) redis-servers, use the `PooledRedisClientManager` instead:

```csharp
container.Register<IRedisClientsManager>(c => 
    new PooledRedisClientManager(redisReadWriteHosts, redisReadOnlyHosts) {
        ConnectTimeout = 100,
        //...
    });
```

**Pooling Behavior**

The `PooledRedisClientManager` imposes a maximum connection limit and when its maximum pool size has been reached will instead block on any new connection requests until the next `RedisClient` is released back into the pool. If no client became available within `PoolTimeout`, a Pool `TimeoutException` will be thrown.
## Redis Configuration

In addition to configuring Redis Client Managers directly, management and behavior of Redis connections can also be configured using the static [RedisConfig.cs](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack.Redis/src/ServiceStack.Redis/RedisConfig.cs) class, e.g:

#### Configure Pool Size of Redis Client Managers

```csharp
RedisConfig.DefaultMaxPoolSize = 100;
```

Available Redis Client Configuration options and their defaults:

| Property | Default | Description |
| - | - | - |
| `DefaultConnectTimeout` | **-1** (none) | The default RedisClient Socket ConnectTimeout |
| `DefaultSendTimeout` | **-1** (none) | The default RedisClient Socket SendTimeout |
| `DefaultReceiveTimeout` | **-1** (none) | The default RedisClient Socket ReceiveTimeout |
| `DefaultIdleTimeOutSecs` | **240** seconds | Default Idle TimeOut before a connection is considered to be stale |
| `DefaultRetryTimeout` | **10000** ms | The default RetryTimeout for auto retry of failed operations |
| `DefaultMaxPoolSize` | **null** (none) | Max Pool Size for Pooled Redis Client Managers (overrides DefaultPoolSizeMultiplier) |
| `DefaultPoolSizeMultiplier` | **50** | The default pool size multiplier if no pool size is specified |
| `BackOffMultiplier` | **10** ms | The BackOff multiplier that failed Auto Retries start from |
| `CommandKeysBatchSize` | **10000** keys | Batch size of keys to include in a single Redis Command (e.g. DEL k1 k2...) |
| `VerifyMasterConnections` | **true** | Whether Connections to Master hosts should be verified they're still a master |
| `RetryReconnectOnFailedMasters` | **true** | Whether to retry re-connecting on same connection if not a master instance |
| `HostLookupTimeoutMs` | **200** ms | The ConnectTimeout on clients used to find the next available host |
| `AssumeServerVersion` | **null** (none) | Skip ServerVersion Checks by specifying Min Version number |
| `DeactivatedClientsExpiry` | **0** seconds | How long to hold deactivated clients for before disposing their connection |
| `EnableVerboseLogging` | **false** | Whether Debug Logging should log detailed Redis operations |
| `AssertAccessOnlyOnSameThread` | **false** | Assert all access using pooled RedisClient instance is limited to same thread |

### [ServiceStack.Redis SSL Support](/ssl-redis-azure)

ServiceStack.Redis supports **SSL connections** making it suitable for accessing remote Redis server instances over a **secure SSL connection**.

![Azure Redis Cache](https://github.com/ServiceStack/Assets/raw/master/img/wikis/redis/azure-redis-instance.png)

#### Specify SSL Protocol

Support for changing the Ssl Protocols used for encrypted SSL connections can be set on the connection string using the `sslprotocols` modifier, e.g:

```csharp
var connString = $"redis://{Host}?ssl=true&sslprotocols=Tls12&password={Password.UrlEncode()}";
var redisManager = new RedisManagerPool(connString);
using var client = redisManager.GetClient();
//...
```

If needed the `RedisConfig` Certificate selection and validation callbacks can be used to [Validate SSL Certificates](http://msdn.microsoft.com/en-us/library/office/dd633677(v=exchg.80).aspx):

```csharp
RedisConfig.CertificateSelectionCallback = (object sender,
     string targetHost,
     X509CertificateCollection localCertificates,
     X509Certificate remoteCertificate,
     string[] acceptableIssuers) => ...
RedisConfig.CertificateValidationCallback = (object sender, X509Certificate certificate, X509Chain chain, SslPolicyErrors sslPolicyErrors) => ... ``` ### Read Only Clients By default resolving a RedisClient with `GetRedisClient()` or `GetRedisClientAsync()` will return a client connected to the configured primary (master) host, if you also have replica (slave) hosts configured, you can access it with the `GetReadOnlyClient()` or `GetReadOnlyClientAsync()` APIs, e.g: ```csharp using var redisReadOnly = clientsManager.GetReadOnlyClient(); ``` ### BasicRedisClientManager If don't want to use connection pooling (i.e. you're accessing a local redis-server instance) you can use a basic (non-pooled) Clients Manager which creates a new `RedisClient` instance each time: ```csharp container.Register<IRedisClientsManager>(c => new BasicRedisClientManager(redisConnectionString)); ``` ### Accessing the Redis Client Once registered, accessing the RedisClient is the same in all Client Managers, e.g: ```csharp var clientsManager = container.Resolve<IRedisClientsManager>(); using var redis = clientsManager.GetClient(); redis.IncrementValue("counter"); List<string> days = redis.GetAllItemsFromList("days"); //Access Typed API var redisTodos = redis.As<Todo>(); redisTodos.Store(new Todo { Id = redisTodos.GetNextSequence(), Content = "Learn Redis", }); var todo = redisTodos.GetById(1); //Access Native Client var redisNative = (IRedisNativeClient)redis; redisNative.Incr("counter"); List<string> days = redisNative.LRange("days", 0, -1); ``` A more detailed list of the available RedisClient APIs used in the example can be seen in the C# interfaces below: - [IRedisClientsManager](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Redis/IRedisClientsManager.cs) - [IRedisClient](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Redis/IRedisClient.cs) - 
[IRedisNativeClient](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Redis/IRedisNativeClient.cs) - [IRedisSubscription](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Redis/IRedisSubscription.cs) #### Pipeline & Transaction APIs - [IRedisTransaction](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Redis/IRedisTransaction.cs) - [IRedisPipelineShared](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Redis/Pipeline/IRedisPipelineShared.cs) - [IRedisQueueableOperation](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Redis/Pipeline/IRedisQueueableOperation.cs) - [IRedisQueueCompletableOperation](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Redis/Pipeline/IRedisQueueCompletableOperation.cs) #### Generic Client APIs - [IRedisTypedClient](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Redis/Generic/IRedisTypedClient.cs) - [IRedisHash](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Redis/Generic/IRedisHash.Generic.cs) - [IRedisList](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Redis/Generic/IRedisList.Generic.cs) - [IRedisSet](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Redis/Generic/IRedisSet.Generic.cs) - [IRedisSortedSet](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Redis/Generic/IRedisSortedSet.Generic.cs) - [IRedisTypedQueueableOperation](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Redis/Generic/IRedisTypedQueueableOperation.cs) #### Server Collection APIs - [IRedisHash](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Redis/IRedisHash.cs) - 
[IRedisList](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Redis/IRedisList.cs) - [IRedisSet](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Redis/IRedisSet.cs) - [IRedisSortedSet](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Redis/IRedisSortedSet.cs) ### Async Redis The async support in ServiceStack.Redis is designed for optimal efficiency and uses `ValueTask` & other modern Async APIs only available in **.NET Standard 2.0** and **.NET Framework v4.7.2+** projects, with async equivalents for most sync APIs contained in the Async Redis interfaces below: - [IRedisClientsManagerAsync](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Redis/IRedisClientsManagerAsync.cs) - [IRedisClientAsync](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Redis/IRedisClientAsync.cs) - [IRedisNativeClientAsync](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Redis/IRedisNativeClientAsync.cs) - [IRedisSubscriptionAsync](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Redis/IRedisSubscriptionAsync.cs) #### Async Pipeline & Transaction APIs - [IRedisTransactionAsync](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Redis/IRedisTransactionAsync.cs) - [IRedisPipelineSharedAsync](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Redis/Pipeline/IRedisPipelineSharedAsync.cs) - [IRedisQueueableOperationAsync](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Redis/Pipeline/IRedisQueueableOperationAsync.cs) - [IRedisQueueCompletableOperationAsync](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Redis/Pipeline/IRedisQueueCompletableOperationAsync.cs) #### Async Generic Client 
APIs - [IRedisTypedClientAsync](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Redis/Generic/IRedisTypedClientAsync.cs) - [IRedisHashAsync](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Redis/Generic/IRedisHash.Generic.Async.cs) - [IRedisListAsync](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Redis/Generic/IRedisList.Generic.Async.cs) - [IRedisSetAsync](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Redis/Generic/IRedisSet.Generic.Async.cs) - [IRedisSortedSetAsync](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Redis/Generic/IRedisSortedSet.Generic.Async.cs) - [IRedisTypedTransactionAsync](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Redis/Generic/IRedisTypedTransactionAsync.cs) - [IRedisTypedQueueableOperationAsync](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Redis/Generic/IRedisTypedQueueableOperationAsync.cs) #### Async Server Collection APIs - [IRedisHashAsync](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Redis/IRedisHashAsync.cs) - [IRedisListAsync](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Redis/IRedisListAsync.cs) - [IRedisSetAsync](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Redis/IRedisSetAsync.cs) - [IRedisSortedSetAsync](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Redis/IRedisSortedSetAsync.cs) ## [Redis GEO](https://github.com/ServiceStackApps/redis-geo) The [release of Redis 3.2.0](http://antirez.com/news/104) brings [GEO capabilities](http://redis.io/commands/geoadd) which will let you store Lat/Long coordinates in Redis and query locations within a specified radius. 
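As a rough sketch of how this maps onto the C# client, members are stored against a GEO key with their Lng/Lat coordinates and then queried by radius. The method names, parameters and `RedisGeoUnit` type below are illustrative assumptions rather than the definitive client surface, so check the Live Demo source for the exact APIs:

```csharp
using ServiceStack.Redis;

//Assumes a local redis-server (v3.2+) instance
using var redis = new RedisClient("localhost");

//GEOADD: store each city against a GEO key with its Lng/Lat coordinates
//(method names are illustrative assumptions)
redis.AddGeoMember("us:IL", longitude: -87.6298, latitude: 41.8781, member: "Chicago");
redis.AddGeoMember("us:IL", longitude: -89.6501, latitude: 39.7817, member: "Springfield");

//GEORADIUS: find all stored cities within 100 miles of the given point
var nearby = redis.FindGeoResultsInRadius("us:IL",
    longitude: -87.9, latitude: 41.9, radius: 100, unit: RedisGeoUnit.Miles,
    sortByNearest: true);
```

The redis-server commands behind this are [GEOADD](http://redis.io/commands/geoadd) and GEORADIUS, which is what the Live Demo below uses to find the nearest cities to a clicked point.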
To demonstrate this functionality we've created a new [Redis GEO Live Demo](https://github.com/ServiceStackApps/redis-geo) which lets you click anywhere in the U.S. to find the list of nearest cities within a given radius. Live Demo at: https://redis.netcore.io --- # Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/redis/client-usage.md --- slug: client-usage title: Getting Started with Redis Client APIs --- Below is a simple example to give you a flavour of how easy it is to use some of Redis's advanced data structures - in this case Redis Lists: _Full source code of this example is [viewable online](https://github.com/ServiceStack/ServiceStack.Redis/blob/master/tests/ServiceStack.Redis.Tests/ShippersExample.cs)_ ```csharp using var redisClient = new RedisClient(); //Create a 'strongly-typed' API that applies all Redis Value operations against Shippers IRedisTypedClient<Shipper> redis = redisClient.As<Shipper>(); //Redis lists implement IList<T> while Redis sets implement ICollection<T> var currentShippers = redis.Lists["urn:shippers:current"]; var prospectiveShippers = redis.Lists["urn:shippers:prospective"]; currentShippers.Add( new Shipper { Id = redis.GetNextSequence(), CompanyName = "Trains R Us", DateCreated = DateTime.UtcNow, ShipperType = ShipperType.Trains, UniqueRef = Guid.NewGuid() }); currentShippers.Add( new Shipper { Id = redis.GetNextSequence(), CompanyName = "Planes R Us", DateCreated = DateTime.UtcNow, ShipperType = ShipperType.Planes, UniqueRef = Guid.NewGuid() }); var lameShipper = new Shipper { Id = redis.GetNextSequence(), CompanyName = "We do everything!", DateCreated = DateTime.UtcNow, ShipperType = ShipperType.All, UniqueRef = Guid.NewGuid() }; currentShippers.Add(lameShipper); Dump("ADDED 3 SHIPPERS:", currentShippers); currentShippers.Remove(lameShipper); Dump("REMOVED 1:", currentShippers); prospectiveShippers.Add( new Shipper { Id = redis.GetNextSequence(), CompanyName = 
"Trucks R Us", DateCreated = DateTime.UtcNow, ShipperType = ShipperType.Automobiles, UniqueRef = Guid.NewGuid() }); Dump("ADDED A PROSPECTIVE SHIPPER:", prospectiveShippers); redis.PopAndPushBetweenLists(prospectiveShippers, currentShippers); Dump("CURRENT SHIPPERS AFTER POP n' PUSH:", currentShippers); Dump("PROSPECTIVE SHIPPERS AFTER POP n' PUSH:", prospectiveShippers); var poppedShipper = redis.PopFromList(currentShippers); Dump("POPPED a SHIPPER:", poppedShipper); Dump("CURRENT SHIPPERS AFTER POP:", currentShippers); //reset sequence and delete all lists redis.SetSequence(0); redis.Remove(currentShippers, prospectiveShippers); Dump("DELETING CURRENT AND PROSPECTIVE SHIPPERS:", currentShippers); ``` EXAMPLE OUTPUT: ``` ADDED 3 SHIPPERS: Id:1,CompanyName:Trains R Us,ShipperType:Trains,DateCreated:2010-01-31T11:53:37.7169323Z,UniqueRef:d17c5db0415b44b2ac5da7b6ebd780f5 Id:2,CompanyName:Planes R Us,ShipperType:Planes,DateCreated:2010-01-31T11:53:37.799937Z,UniqueRef:e02a73191f4b4e7a9c44eef5b5965d06 Id:3,CompanyName:We do everything!,ShipperType:All,DateCreated:2010-01-31T11:53:37.8009371Z,UniqueRef:d0c249bbbaf84da39fc4afde1b34e332 REMOVED 1: Id:1,CompanyName:Trains R Us,ShipperType:Trains,DateCreated:2010-01-31T11:53:37.7169323Z,UniqueRef:d17c5db0415b44b2ac5da7b6ebd780f5 Id:2,CompanyName:Planes R Us,ShipperType:Planes,DateCreated:2010-01-31T11:53:37.799937Z,UniqueRef:e02a73191f4b4e7a9c44eef5b5965d06 ADDED A PROSPECTIVE SHIPPER: Id:4,CompanyName:Trucks R Us,ShipperType:Automobiles,DateCreated:2010-01-31T11:53:37.8539401Z,UniqueRef:67d7d4947ebc4b0ba5c4d42f5d903bec CURRENT SHIPPERS AFTER POP n' PUSH: Id:4,CompanyName:Trucks R Us,ShipperType:Automobiles,DateCreated:2010-01-31T11:53:37.8539401Z,UniqueRef:67d7d4947ebc4b0ba5c4d42f5d903bec Id:1,CompanyName:Trains R Us,ShipperType:Trains,DateCreated:2010-01-31T11:53:37.7169323Z,UniqueRef:d17c5db0415b44b2ac5da7b6ebd780f5 Id:2,CompanyName:Planes R 
Us,ShipperType:Planes,DateCreated:2010-01-31T11:53:37.799937Z,UniqueRef:e02a73191f4b4e7a9c44eef5b5965d06 PROSPECTIVE SHIPPERS AFTER POP n' PUSH: POPPED a SHIPPER: Id:2,CompanyName:Planes R Us,ShipperType:Planes,DateCreated:2010-01-31T11:53:37.799937Z,UniqueRef:e02a73191f4b4e7a9c44eef5b5965d06 CURRENT SHIPPERS AFTER POP: Id:4,CompanyName:Trucks R Us,ShipperType:Automobiles,DateCreated:2010-01-31T11:53:37.8539401Z,UniqueRef:67d7d4947ebc4b0ba5c4d42f5d903bec Id:1,CompanyName:Trains R Us,ShipperType:Trains,DateCreated:2010-01-31T11:53:37.7169323Z,UniqueRef:d17c5db0415b44b2ac5da7b6ebd780f5 DELETING CURRENT AND PROSPECTIVE SHIPPERS: ``` More examples are available in the Redis examples pages and in the comprehensive [test suite](https://github.com/ServiceStack/ServiceStack.Redis/tree/master/tests/ServiceStack.Redis.Tests) ## Speed One of the best things about Redis is its speed. [This example](https://github.com/ServiceStack/ServiceStack.Redis/blob/master/tests/ServiceStack.Redis.Tests/RedisClientTests.cs) below stores and gets the entire [Northwind database](http://code.google.com/p/servicestack/source/browse/trunk/Common/Northwind.Benchmarks/Northwind.Common/DataModel/NorthwindData.cs) (3202 records) in less than *1.2 secs* - we've never had it so quick! 
_(Running inside a VS.NET/R# unit test on a 3 year old iMac)_ ```csharp using var client = new RedisClient(); var totalRecords = 3202; //total number of records across all Northwind tables var before = DateTime.Now; client.StoreAll(NorthwindData.Categories); client.StoreAll(NorthwindData.Customers); client.StoreAll(NorthwindData.Employees); client.StoreAll(NorthwindData.Shippers); client.StoreAll(NorthwindData.Orders); client.StoreAll(NorthwindData.Products); client.StoreAll(NorthwindData.OrderDetails); client.StoreAll(NorthwindData.CustomerCustomerDemos); client.StoreAll(NorthwindData.Regions); client.StoreAll(NorthwindData.Territories); client.StoreAll(NorthwindData.EmployeeTerritories); Console.WriteLine("Took {0}ms to store the entire Northwind database ({1} records)", (DateTime.Now - before).TotalMilliseconds, totalRecords); before = DateTime.Now; var categories = client.GetAll<Category>(); var customers = client.GetAll<Customer>(); var employees = client.GetAll<Employee>(); var shippers = client.GetAll<Shipper>(); var orders = client.GetAll<Order>(); var products = client.GetAll<Product>(); var orderDetails = client.GetAll<OrderDetail>(); var customerCustomerDemos = client.GetAll<CustomerCustomerDemo>(); var regions = client.GetAll<Region>(); var territories = client.GetAll<Territory>(); var employeeTerritories = client.GetAll<EmployeeTerritory>(); Console.WriteLine("Took {0}ms to get the entire Northwind database ({1} records)", (DateTime.Now - before).TotalMilliseconds, totalRecords); /* == EXAMPLE OUTPUT == Took 1020.0583ms to store the entire Northwind database (3202 records) Took 132.0076ms to get the entire Northwind database (3202 records) */ ``` Note: The total time taken includes an extra Redis operation for each record to store the id in a Redis set for each type as well as serializing and de-serializing each record using ServiceStack's TypeSerializer. ## Lex Operations Support for the new [ZRANGEBYLEX](http://redis.io/commands/zrangebylex) sorted set operations, which let you query a sorted set lexically, has been added. 
A good showcase for this is available on [autocomplete.redis.io](http://autocomplete.redis.io/). These new operations are available as a 1:1 mapping with redis-server on `IRedisNativeClient`: ```csharp public interface IRedisNativeClient { ... byte[][] ZRangeByLex(string setId, string min, string max, int? skip, int? take); long ZLexCount(string setId, string min, string max); long ZRemRangeByLex(string setId, string min, string max); } ``` And the more user-friendly APIs under `IRedisClient`: ```csharp public interface IRedisClient { ... List<string> SearchSortedSet(string setId, string start=null, string end=null); long SearchSortedSetCount(string setId, string start=null, string end=null); long RemoveRangeFromSortedSetBySearch(string setId, string start=null, string end=null); } ``` Just like NuGet version matchers, Redis uses the `[` char to express inclusiveness and the `(` char for exclusiveness. Since the `IRedisClient` APIs default to inclusive searches, these two APIs are the same: ```csharp Redis.SearchSortedSetCount("zset", "a", "c") Redis.SearchSortedSetCount("zset", "[a", "[c") ``` Alternatively you can specify one or both bounds to be exclusive by using the `(` prefix, e.g: ```csharp Redis.SearchSortedSetCount("zset", "a", "(c") Redis.SearchSortedSetCount("zset", "(a", "(c") ``` More API examples are available in [LexTests.cs](https://github.com/ServiceStack/ServiceStack.Redis/blob/master/tests/ServiceStack.Redis.Tests/LexTests.cs). ## HyperLog API The development branch of Redis server (available when v3.0 is released) includes an ingenious algorithm to approximate the unique elements in a set with maximum space and time efficiency. For details about how it works see the blog of Redis's creator Salvatore, who [explains it in great detail](http://antirez.com/news/75). Essentially it lets you maintain an efficient way to count and merge unique elements in a set without having to store its elements. 
A simple example of it in action: ```csharp redis.AddToHyperLog("set1", "a", "b", "c"); redis.AddToHyperLog("set1", "c", "d"); var count = redis.CountHyperLog("set1"); //4 redis.AddToHyperLog("set2", "c", "d", "e", "f"); redis.MergeHyperLogs("mergedset", "set1", "set2"); var mergeCount = redis.CountHyperLog("mergedset"); //6 ``` ## Scan APIs Redis v2.8 introduced a beautiful new [SCAN](http://redis.io/commands/scan) operation that provides an optimal strategy for traversing a redis instance's entire keyset in manageable-sized chunks utilizing only a client-side cursor and without introducing any server state. It's a higher performance alternative and should be used instead of [KEYS](http://redis.io/commands/keys) in application code. SCAN and its related operations for traversing members of Sets, Sorted Sets and Hashes are now available in the Redis Client in the following APIs: ```csharp public interface IRedisClient { ... IEnumerable<string> ScanAllKeys(string pattern = null, int pageSize = 1000); IEnumerable<string> ScanAllSetItems(string setId, string pattern = null, int pageSize = 1000); IEnumerable<KeyValuePair<string, double>> ScanAllSortedSetItems(string setId, string pattern = null, int pageSize = 1000); IEnumerable<KeyValuePair<string, string>> ScanAllHashEntries(string hashId, string pattern = null, int pageSize = 1000); } public interface IRedisClientAsync { IAsyncEnumerable<string> ScanAllKeysAsync(string pattern = null, int pageSize = 1000, CancellationToken ct = default); IAsyncEnumerable<string> ScanAllSetItemsAsync(string setId, string pattern = null, int pageSize = 1000, CancellationToken ct = default); IAsyncEnumerable<KeyValuePair<string, double>> ScanAllSortedSetItemsAsync(string setId, string pattern = null, int pageSize = 1000, CancellationToken ct = default); IAsyncEnumerable<KeyValuePair<string, string>> ScanAllHashEntriesAsync(string hashId, string pattern = null, int pageSize = 1000, CancellationToken ct = default); } //Low-level API public interface IRedisNativeClient { ... 
ScanResult Scan(ulong cursor, int count = 10, string match = null); ScanResult SScan(string setId, ulong cursor, int count = 10, string match = null); ScanResult ZScan(string setId, ulong cursor, int count = 10, string match = null); ScanResult HScan(string hashId, ulong cursor, int count = 10, string match = null); } public interface IRedisNativeClientAsync { ValueTask<ScanResult> ScanAsync(ulong cursor, int count = 10, string match = null, CancellationToken ct = default); ValueTask<ScanResult> SScanAsync(string setId, ulong cursor, int count = 10, string match = null, CancellationToken ct = default); ValueTask<ScanResult> ZScanAsync(string setId, ulong cursor, int count = 10, string match = null, CancellationToken ct = default); ValueTask<ScanResult> HScanAsync(string hashId, ulong cursor, int count = 10, string match = null, CancellationToken ct = default); } ``` The `IRedisClient` provides a higher-level API that abstracts away the client cursor, exposing a lazy Enumerable sequence that streams scanned results and integrates nicely with LINQ, e.g: ```csharp var scanUsers = Redis.ScanAllKeys("urn:User:*"); var sampleUsers = scanUsers.Take(10000).ToList(); //Stop after retrieving 10000 user keys ``` --- # Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/redis/client.md --- slug: client title: Redis Client --- Details of the IRedisClient API implemented by the [ServiceStack.Redis client](https://github.com/ServiceStack/ServiceStack.Redis) ## Introduction This is a friendly, more descriptive API implemented by the [ServiceStack.Redis client](https://github.com/ServiceStack/ServiceStack.Redis) that provides access to key values as strings (or collections of strings for Redis lists and sets). Use this API if you just need to access values as strings or you want to have control over your own text serialization format. 
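For instance, assuming a redis-server instance on localhost, the string APIs below (all part of the `IRedisClient` interface that follows) map directly onto basic Redis value operations:

```csharp
using ServiceStack.Redis;

//Assumes a local redis-server instance
using var redis = new RedisClient("localhost");

redis.SetEntry("city", "Brisbane");                            //SET
var city = redis.GetEntry("city");                             //GET
redis.SetEntry("session:1", "data", TimeSpan.FromMinutes(10)); //SET with expiry
redis.AppendToValue("city", ", AU");                           //APPEND
var visits = redis.IncrementValue("page:views");               //INCR
```
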
## API ```csharp public interface IRedisClient : IEntityStore, ICacheClient { //Basic Redis Connection operations long Db { get; set; } long DbSize { get; } Dictionary<string, string> Info { get; } DateTime LastSave { get; } string Host { get; } int Port { get; } int ConnectTimeout { get; set; } int RetryTimeout { get; set; } int RetryCount { get; set; } int SendTimeout { get; set; } string Password { get; set; } bool HadExceptions { get; } void Save(); void SaveAsync(); void Shutdown(); void RewriteAppendOnlyFileAsync(); void FlushDb(); //Basic Redis Connection Info string this[string key] { get; set; } List<string> GetAllKeys(); void SetEntry(string key, string value); void SetEntry(string key, string value, TimeSpan expireIn); bool SetEntryIfNotExists(string key, string value); void SetAll(IEnumerable<string> keys, IEnumerable<string> values); void SetAll(Dictionary<string, string> map); string GetEntry(string key); string GetValue(string key); string GetAndSetEntry(string key, string value); List<string> GetValues(List<string> keys); List<T> GetValues<T>(List<string> keys); Dictionary<string, string> GetValuesMap(List<string> keys); Dictionary<string, T> GetValuesMap<T>(List<string> keys); long AppendToValue(string key, string value); void RenameKey(string fromName, string toName); //store POCOs as hash T GetFromHash<T>(object id); void StoreAsHash<T>(T entity); object StoreObject(object entity); bool ContainsKey(string key); bool RemoveEntry(params string[] args); long IncrementValue(string key); long IncrementValueBy(string key, int count); long IncrementValueBy(string key, long count); double IncrementValueBy(string key, double count); long DecrementValue(string key); long DecrementValueBy(string key, int count); List<string> SearchKeys(string pattern); RedisKeyType GetEntryType(string key); string GetRandomKey(); bool ExpireEntryIn(string key, TimeSpan expireIn); bool ExpireEntryAt(string key, DateTime expireAt); TimeSpan GetTimeToLive(string key); 
List<string> GetSortedEntryValues(string key, int startingFrom, int endingAt); //Store entities without registering entity ids void WriteAll<TEntity>(IEnumerable<TEntity> entities); //Scan APIs IEnumerable<string> ScanAllKeys(string pattern = null, int pageSize = 1000); IEnumerable<string> ScanAllSetItems(string setId, string pattern = null, int pageSize = 1000); IEnumerable<KeyValuePair<string, double>> ScanAllSortedSetItems(string setId, string pattern = null, int pageSize = 1000); IEnumerable<KeyValuePair<string, string>> ScanAllHashEntries(string hashId, string pattern = null, int pageSize = 1000); //Hyperlog APIs bool AddToHyperLog(string key, params string[] elements); long CountHyperLog(string key); void MergeHyperLogs(string toKey, params string[] fromKeys); /// <summary> /// Returns a high-level typed client API /// </summary> /// <typeparam name="T"></typeparam> IRedisTypedClient<T> As<T>(); IHasNamed<IRedisList> Lists { get; set; } IHasNamed<IRedisSet> Sets { get; set; } IHasNamed<IRedisSortedSet> SortedSets { get; set; } IHasNamed<IRedisHash> Hashes { get; set; } IRedisTransaction CreateTransaction(); IRedisPipeline CreatePipeline(); IDisposable AcquireLock(string key); IDisposable AcquireLock(string key, TimeSpan timeOut); #region Redis pubsub void Watch(params string[] keys); void UnWatch(); IRedisSubscription CreateSubscription(); long PublishMessage(string toChannel, string message); #endregion #region Set operations HashSet<string> GetAllItemsFromSet(string setId); void AddItemToSet(string setId, string item); void AddRangeToSet(string setId, List<string> items); void RemoveItemFromSet(string setId, string item); string PopItemFromSet(string setId); void MoveBetweenSets(string fromSetId, string toSetId, string item); long GetSetCount(string setId); bool SetContainsItem(string setId, string item); HashSet<string> GetIntersectFromSets(params string[] setIds); void StoreIntersectFromSets(string intoSetId, params string[] setIds); HashSet<string> 
GetUnionFromSets(params string[] setIds); void StoreUnionFromSets(string intoSetId, params string[] setIds); HashSet<string> GetDifferencesFromSet(string fromSetId, params string[] withSetIds); void StoreDifferencesFromSet(string intoSetId, string fromSetId, params string[] withSetIds); string GetRandomItemFromSet(string setId); #endregion #region List operations List<string> GetAllItemsFromList(string listId); List<string> GetRangeFromList(string listId, int startingFrom, int endingAt); List<string> GetRangeFromSortedList(string listId, int startingFrom, int endingAt); List<string> GetSortedItemsFromList(string listId, SortOptions sortOptions); void AddItemToList(string listId, string value); void AddRangeToList(string listId, List<string> values); void PrependItemToList(string listId, string value); void PrependRangeToList(string listId, List<string> values); void RemoveAllFromList(string listId); string RemoveStartFromList(string listId); string BlockingRemoveStartFromList(string listId, TimeSpan? timeOut); ItemRef BlockingRemoveStartFromLists(string[] listIds, TimeSpan? timeOut); string RemoveEndFromList(string listId); void TrimList(string listId, int keepStartingFrom, int keepEndingAt); long RemoveItemFromList(string listId, string value); long RemoveItemFromList(string listId, string value, int noOfMatches); long GetListCount(string listId); string GetItemFromList(string listId, int listIndex); void SetItemInList(string listId, int listIndex, string value); //Queue operations void EnqueueItemOnList(string listId, string value); string DequeueItemFromList(string listId); string BlockingDequeueItemFromList(string listId, TimeSpan? timeOut); ItemRef BlockingDequeueItemFromLists(string[] listIds, TimeSpan? timeOut); //Stack operations void PushItemToList(string listId, string value); string PopItemFromList(string listId); string BlockingPopItemFromList(string listId, TimeSpan? timeOut); ItemRef BlockingPopItemFromLists(string[] listIds, TimeSpan? 
timeOut); string PopAndPushItemBetweenLists(string fromListId, string toListId); string BlockingPopAndPushItemBetweenLists(string fromListId, string toListId, TimeSpan? timeOut); #endregion #region Sorted Set operations bool AddItemToSortedSet(string setId, string value); bool AddItemToSortedSet(string setId, string value, double score); bool AddRangeToSortedSet(string setId, List<string> values, double score); bool AddRangeToSortedSet(string setId, List<string> values, long score); bool RemoveItemFromSortedSet(string setId, string value); string PopItemWithLowestScoreFromSortedSet(string setId); string PopItemWithHighestScoreFromSortedSet(string setId); bool SortedSetContainsItem(string setId, string value); double IncrementItemInSortedSet(string setId, string value, double incrementBy); double IncrementItemInSortedSet(string setId, string value, long incrementBy); long GetItemIndexInSortedSet(string setId, string value); long GetItemIndexInSortedSetDesc(string setId, string value); List<string> GetAllItemsFromSortedSet(string setId); List<string> GetAllItemsFromSortedSetDesc(string setId); List<string> GetRangeFromSortedSet(string setId, int fromRank, int toRank); List<string> GetRangeFromSortedSetDesc(string setId, int fromRank, int toRank); IDictionary<string, double> GetAllWithScoresFromSortedSet(string setId); IDictionary<string, double> GetRangeWithScoresFromSortedSet(string setId, int fromRank, int toRank); IDictionary<string, double> GetRangeWithScoresFromSortedSetDesc(string setId, int fromRank, int toRank); List<string> GetRangeFromSortedSetByLowestScore(string setId, string fromStringScore, string toStringScore); List<string> GetRangeFromSortedSetByLowestScore(string setId, string fromStringScore, string toStringScore, int? skip, int? 
take); List<string> GetRangeFromSortedSetByLowestScore(string setId, double fromScore, double toScore); List<string> GetRangeFromSortedSetByLowestScore(string setId, long fromScore, long toScore); List<string> GetRangeFromSortedSetByLowestScore(string setId, double fromScore, double toScore, int? skip, int? take); List<string> GetRangeFromSortedSetByLowestScore(string setId, long fromScore, long toScore, int? skip, int? take); IDictionary<string, double> GetRangeWithScoresFromSortedSetByLowestScore(string setId, string fromStringScore, string toStringScore); IDictionary<string, double> GetRangeWithScoresFromSortedSetByLowestScore(string setId, string fromStringScore, string toStringScore, int? skip, int? take); IDictionary<string, double> GetRangeWithScoresFromSortedSetByLowestScore(string setId, double fromScore, double toScore); IDictionary<string, double> GetRangeWithScoresFromSortedSetByLowestScore(string setId, long fromScore, long toScore); IDictionary<string, double> GetRangeWithScoresFromSortedSetByLowestScore(string setId, double fromScore, double toScore, int? skip, int? take); IDictionary<string, double> GetRangeWithScoresFromSortedSetByLowestScore(string setId, long fromScore, long toScore, int? skip, int? take); List<string> GetRangeFromSortedSetByHighestScore(string setId, string fromStringScore, string toStringScore); List<string> GetRangeFromSortedSetByHighestScore(string setId, string fromStringScore, string toStringScore, int? skip, int? take); List<string> GetRangeFromSortedSetByHighestScore(string setId, double fromScore, double toScore); List<string> GetRangeFromSortedSetByHighestScore(string setId, long fromScore, long toScore); List<string> GetRangeFromSortedSetByHighestScore(string setId, double fromScore, double toScore, int? skip, int? take); List<string> GetRangeFromSortedSetByHighestScore(string setId, long fromScore, long toScore, int? skip, int? 
take); IDictionary<string, double> GetRangeWithScoresFromSortedSetByHighestScore(string setId, string fromStringScore, string toStringScore); IDictionary<string, double> GetRangeWithScoresFromSortedSetByHighestScore(string setId, string fromStringScore, string toStringScore, int? skip, int? take); IDictionary<string, double> GetRangeWithScoresFromSortedSetByHighestScore(string setId, double fromScore, double toScore); IDictionary<string, double> GetRangeWithScoresFromSortedSetByHighestScore(string setId, long fromScore, long toScore); IDictionary<string, double> GetRangeWithScoresFromSortedSetByHighestScore(string setId, double fromScore, double toScore, int? skip, int? take); IDictionary<string, double> GetRangeWithScoresFromSortedSetByHighestScore(string setId, long fromScore, long toScore, int? skip, int? take); long RemoveRangeFromSortedSet(string setId, int minRank, int maxRank); long RemoveRangeFromSortedSetByScore(string setId, double fromScore, double toScore); long RemoveRangeFromSortedSetByScore(string setId, long fromScore, long toScore); long GetSortedSetCount(string setId); long GetSortedSetCount(string setId, string fromStringScore, string toStringScore); long GetSortedSetCount(string setId, long fromScore, long toScore); long GetSortedSetCount(string setId, double fromScore, double toScore); double GetItemScoreInSortedSet(string setId, string value); long StoreIntersectFromSortedSets(string intoSetId, params string[] setIds); long StoreUnionFromSortedSets(string intoSetId, params string[] setIds); List<string> SearchSortedSet(string setId, string start = null, string end = null, int? skip = null, int? 
take = null); long SearchSortedSetCount(string setId, string start = null, string end = null); long RemoveRangeFromSortedSetBySearch(string setId, string start = null, string end = null); #endregion #region Hash operations bool HashContainsEntry(string hashId, string key); bool SetEntryInHash(string hashId, string key, string value); bool SetEntryInHashIfNotExists(string hashId, string key, string value); void SetRangeInHash(string hashId, IEnumerable<KeyValuePair<string, string>> keyValuePairs); long IncrementValueInHash(string hashId, string key, int incrementBy); double IncrementValueInHash(string hashId, string key, double incrementBy); string GetValueFromHash(string hashId, string key); List<string> GetValuesFromHash(string hashId, params string[] keys); bool RemoveEntryFromHash(string hashId, string key); long GetHashCount(string hashId); List<string> GetHashKeys(string hashId); List<string> GetHashValues(string hashId); Dictionary<string, string> GetAllEntriesFromHash(string hashId); #endregion #region Eval/Lua operations string ExecLuaAsString(string luaBody, params string[] args); string ExecLuaAsString(string luaBody, string[] keys, string[] args); string ExecLuaShaAsString(string sha1, params string[] args); string ExecLuaShaAsString(string sha1, string[] keys, string[] args); long ExecLuaAsInt(string luaBody, params string[] args); long ExecLuaAsInt(string luaBody, string[] keys, string[] args); long ExecLuaShaAsInt(string sha1, params string[] args); long ExecLuaShaAsInt(string sha1, string[] keys, string[] args); List<string> ExecLuaAsList(string luaBody, params string[] args); List<string> ExecLuaAsList(string luaBody, string[] keys, string[] args); List<string> ExecLuaShaAsList(string sha1, params string[] args); List<string> ExecLuaShaAsList(string sha1, string[] keys, string[] args); string CalculateSha1(string luaBody); bool HasLuaScript(string sha1Ref); Dictionary<string, bool> WhichLuaScriptsExists(params string[] sha1Refs); void 
RemoveAllLuaScripts(); void KillRunningLuaScript(); string LoadLuaScript(string body); #endregion } ``` Generally, if you only have basic persistence needs I would recommend developing against the above common data access API, as it is easier for other persistence providers to implement and increases the likelihood that your existing library can be reused as-is to persist your POCO types against other data stores, e.g. against an RDBMS with OrmLite. --- # Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/clients-overview.md --- title: Service Clients Overview --- As ServiceStack Services are pure HTTP APIs they're accessible with any HTTP-capable client, but they also support native client integrations with the popular languages used to create Web, Mobile and Desktop Apps, for maximum productivity and correctness. The developer workflow is further simplified with IDE plugins that let you generate native client DTOs directly from your favorite IDEs: <section class="text-center"> <div class="container"> <div class="mx-auto max-w-md px-4 text-center sm:max-w-3xl sm:px-6 lg:max-w-7xl lg:px-8"> <h2 class="text-base font-semibold uppercase tracking-wider text-indigo-600">Develop faster</h2> <p class="mt-2 text-3xl font-extrabold tracking-tight text-gray-900 dark:text-gray-50 sm:text-4xl">Right Click, Integrate</p> <p class="mx-auto mt-5 max-w-prose text-xl text-gray-500"> Native client integrations for your APIs in all popular languages and IDEs </p> </div> <div class="flex flex-wrap"> <div class="w-full lg:w-1/4 mt-4"> <a href="https://marketplace.visualstudio.com/items?itemName=Mythz.ServiceStackVS"> <div class="inline-flex justify-center"> <img src="/img/pages/svg/vs-2019.svg" class="w-20 h-20"> </div> <h3>Visual Studio</h3> </a> <h4>Languages Supported</h4> <p class="italic"> C#, F#, TypeScript, VB.NET </p> </div> <div class="w-full lg:w-1/4 mt-4"> <a 
href="https://plugins.jetbrains.com/plugin/17295-servicestack"> <div class="inline-flex justify-center"> <img src="/img/pages/svg/ides/icon-rider.svg" class="w-20 h-20"> </div> <h3>Rider</h3> </a> <h4>Languages Supported</h4> <p class="italic"> C#, F#, TypeScript, VB.NET </p> </div> <div class="w-full lg:w-1/4 mt-4"> <a href="https://plugins.jetbrains.com/plugin/7749-servicestack"> <div class="inline-flex justify-center"> <img src="/img/pages/svg/ides/icon-intellij-idea.svg" class="w-20 h-20"> </div> <h3>IntelliJ</h3> </a> <h4>Languages Supported</h4> <p class="italic"> Java, Kotlin, TypeScript </p> </div> <div class="w-full lg:w-1/4 mt-4"> <a href="https://plugins.jetbrains.com/plugin/7749-servicestack"> <div class="inline-flex justify-center"> <img src="/img/pages/svg/ides/icon-webstorm.svg" class="w-20 h-20"> </div> <h3>WebStorm</h3> </a> <h4>Languages Supported</h4> <p class="italic"> TypeScript </p> </div> <div class="w-full lg:w-1/4 mt-4"> <a href="https://plugins.jetbrains.com/plugin/7749-servicestack"> <div class="inline-flex justify-center"> <img src="/img/pages/svg/ides/icon-pycharm.svg" class="w-20 h-20"> </div> <h3>PyCharm</h3> </a> <h4>Languages Supported</h4> <p class="italic"> Python, TypeScript </p> </div> <div class="w-full lg:w-1/4 mt-4"> <a href="https://plugins.jetbrains.com/plugin/7749-servicestack"> <div class="inline-flex justify-center"> <img src="/img/pages/svg/ides/icon-rubymine.svg" class="w-20 h-20"> </div> <h3>RubyMine</h3> </a> <h4>Languages Supported</h4> <p class="italic"> TypeScript </p> </div> <div class="w-full lg:w-1/4 mt-4"> <a href="https://plugins.jetbrains.com/plugin/7749-servicestack"> <div class="inline-flex justify-center"> <img src="/img/pages/svg/androidstudio.svg" class="w-20 h-20"> </div> <h3>Android Studio</h3> </a> <h4>Languages Supported</h4> <p class="italic"> Java, Kotlin, TypeScript </p> </div> <div class="w-full lg:w-1/4 mt-4"> <a href="https://marketplace.eclipse.org/content/servicestackeclipse"> <div 
class="inline-flex justify-center"> <img src="/img/pages/svg/ides/eclipse-11.svg" height="4em" alt="" style="width:70px;height:70px;"> </div>
<h3>Eclipse</h3>
</a>
<h4>Languages Supported</h4>
<p class="italic"> Java </p>
</div>
</div>
</div>
</section>

Support for all languages is implemented the same way: the generated DTOs can be used with idiomatic [generic Service Clients](#servicestack-clients), giving developers a consistent way of creating and updating DTOs regardless of their language of choice.

### Command-line Tooling

The [x dotnet tool](/dotnet-tool) also allows us to generate these native service references from the command-line with the format `x <lang> <url>`, e.g. we can create C# DTOs for our App with:

:::sh
x csharp https://localhost:5001
:::

Output:

```
Saved to: dtos.cs
```

Or create a TypeScript ServiceStack Reference with:

:::sh
x typescript https://localhost:5001
:::

### Updating a ServiceStack Reference

To update an existing ServiceStack Reference, call `x typescript` with the Filename, e.g:

:::sh
x typescript dtos.ts
:::

Result:

```
Updated: dtos.ts
```

This will update the File with your App's latest TypeScript Server DTOs. DTO customizations are also available by uncommenting the [TypeScript DTO Customization Options](/typescript-add-servicestack-reference#dto-customization-options) and updating them again.
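For illustration, a generated `dtos.cs` contains plain, partial C# DTO classes — the sketch below assumes your App defines a typical Hello World API (the `Hello`/`HelloResponse` names are hypothetical, not taken from this page):

```csharp
// Illustrative sketch of generated C# DTOs for a hypothetical Hello API
using ServiceStack;

[Route("/hello/{Name}")]
public partial class Hello : IReturn<HelloResponse>
{
    public virtual string Name { get; set; }
}

public partial class HelloResponse
{
    public virtual string Result { get; set; }
}
```

Because the generated DTOs are `partial` classes, any extensions you add to them in separate source files survive re-generation when the reference is updated.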
#### Updating all DTOs Calling `x typescript` without any arguments will update **all TypeScript DTOs** in the current directory: :::sh x typescript ::: Other available languages include: <table class="table table-bordered"> <tr> <th>Script</th> <th>Alias</th> <th>Language</th> </tr> <tr> <td>x csharp</td> <td>x cs</td> <td>C#</td> </tr> <tr> <td>x typescript</td> <td>x ts</td> <td>TypeScript</td> </tr> <tr> <td>x python</td> <td>x py</td> <td>Python</td> </tr> <tr> <td>x java</td> <td></td> <td>Java</td> </tr> <tr> <td>x kotlin</td> <td>x kt</td> <td>Kotlin</td> </tr> <tr> <td>x swift</td> <td></td> <td>Swift</td> </tr> <tr> <td>x dart</td> <td></td> <td>Dart</td> </tr> <tr> <td>x vbnet</td> <td>x vb</td> <td>VB.NET</td> </tr> <tr> <td>x fsharp</td> <td>x fs</td> <td>F#</td> </tr> </table> ### ServiceStack Clients To enable its clean end-to-end typed API development model, the generated DTOs can be used with a generic Service Client available for each supported language: * [C#/.NET Client](/csharp-client) * [TypeScript Client](/typescript-add-servicestack-reference) * [Kotlin Client](/kotlin-add-servicestack-reference) * [Java Client](/java-add-servicestack-reference) * [Swift Client](/swift-add-servicestack-reference) * [Dart Client](/dart-add-servicestack-reference#example-usage) * [JavaScript Client](/javascript-client) * [MQ Clients](/redis-mq) ### Supported Languages This [Add ServiceStack Reference](/add-servicestack-reference) feature is available for all the popular supported languages below: <table class="table table-bordered w-full" style="text-align:center"> <tr> <td><a href="/csharp-add-servicestack-reference">C#</a></td> <td><a href="/typescript-add-servicestack-reference">TypeScript</a></td> <td><a href="/javascript-add-servicestack-reference">JavaScript</a></td> <td><a href="/python-add-servicestack-reference">Python</a></td> <td><a href="/swift-add-servicestack-reference">Swift</a></td> <td><a href="/java-add-servicestack-reference">Java</a></td> 
<td><a href="/kotlin-add-servicestack-reference">Kotlin</a></td>
<td><a href="/dart-add-servicestack-reference">Dart</a></td>
<td><a href="/fsharp-add-servicestack-reference">F#</a></td>
<td><a href="/vbnet-add-servicestack-reference">VB.NET</a></td>
</tr>
</table>

![](./img/pages/add-ss-ref.svg)

## Development workflow preview

Here's a quick walkthrough of installing the **ServiceStack** plugin and using it to add remote ServiceStack References in a new C# App:

<lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="JKsgrstNnYY" style="background-image: url('https://img.youtube.com/vi/JKsgrstNnYY/maxresdefault.jpg')"></lite-youtube>

:::tip
VSCode and other IDEs will be able to use the command-line tool for adding and updating multiple Services references.
:::

### C# Xamarin.Android Example in VS.NET

Using C# to develop native Mobile and Desktop Apps provides a number of benefits, including maximum reuse of your investments across multiple Client Apps, which can share functionality, libraries, knowledge, development workflow and environment in both Client and Server Apps.

<lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="cbYuem1b2tg" style="background-image: url('https://img.youtube.com/vi/cbYuem1b2tg/maxresdefault.jpg')"></lite-youtube>

### Call ServiceStack APIs from a Flutter App with native Dart client and DTOs

A walkthrough showing how you can use ServiceStack's Dart client library with your Flutter Android application to quickly get up and running with Add ServiceStack Reference.

<lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="ocH5L-CikQ0" style="background-image: url('https://img.youtube.com/vi/ocH5L-CikQ0/maxresdefault.jpg')"></lite-youtube>

### Call ServiceStack APIs from Python

This video tutorial looks at how we can leverage Add ServiceStack Reference for Python in PyCharm, VSCode and [Python Jupyter Notebooks](/jupyter-notebooks-python).
<lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="WjbhfH45i5k" style="background-image: url('https://img.youtube.com/vi/WjbhfH45i5k/maxresdefault.jpg')"></lite-youtube>

### Instant Client Apps

[Instant Client Apps](https://apps.servicestack.net/) is a free tool to jump start your native client application development using a wide range of languages and platforms including: C#, NodeJS, Dart, Java, Kotlin, Swift, VB .NET and F#:

<lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="GTnuMhvUayg" style="background-image: url('https://img.youtube.com/vi/GTnuMhvUayg/maxresdefault.jpg')"></lite-youtube>

## gRPC

[ServiceStack gRPC](/grpc/) enables a highly productive development environment for developing high-performance gRPC HTTP/2 Services by making ServiceStack's existing typed Services available from ASP.NET's gRPC endpoints, where ServiceStack offers a simplified development model for gRPC Clients for streamlined end-to-end productivity.

<lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="UQlYodNS1xc" style="background-image: url('https://img.youtube.com/vi/UQlYodNS1xc/maxresdefault.jpg')"></lite-youtube>

## C# Mobile and Desktop Apps

[![](https://raw.githubusercontent.com/ServiceStackApps/HelloMobile/master/screenshots/splash-900.png)](https://github.com/ServiceStackApps/HelloMobile)

The generated DTOs provide a highly productive development workflow and enable a succinct end-to-end Typed API that can be used in both **.NET Framework** and **.NET Standard 2.0** [Generic Service Clients](/csharp-client) to facilitate Rapid Development in .NET's most popular Mobile and Desktop platforms:

- WPF
- UWP
- Xamarin.Android
- Xamarin.iOS
- Xamarin.OSX
- Xamarin.Forms
  - iOS
  - Android
  - UWP

The [HelloMobile](https://github.com/ServiceStackApps/HelloMobile) project contains multiple versions of the same App in all the above platforms demonstrating a number of different calling conventions, service integrations and
reuse possibilities.

ServiceStack also allows for maximum reuse by letting you reuse the same POCO DTOs used to define the Services contract in your Client Apps, providing an end-to-end typed API without any additional custom build tools, code-gen or other artificial machinery. Using just the DTOs in the shared `ServiceModel.dll` with any of the available highly performant [.NET generic Service Clients](/csharp-client) by design encourages development of [resilient message-based Services](/what-is-a-message-based-web-service), enabling [highly decoupled](/service-gateway) and easily [substitutable and mockable](/csharp-client#built-in-clients) Service Integrations.

### .NET Clients Message-based API

There are multiple C# service clients included, each optimized for their respective formats:

![ServiceStack HTTP Client Architecture](/img/pages/overview/servicestack-httpclients.png)

- [JSON Client](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Client/JsonServiceClient.cs)
- [XML Client](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Client/XmlServiceClient.cs)
- [JSV Client](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Client/JsvServiceClient.cs)
- [SOAP 1.1/1.2 Clients](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Client/Soap12ServiceClient.cs)
- [ProtoBuf Client](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.ProtoBuf/ProtoBufServiceClient.cs)

All clients share the same [IServiceClient](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/IServiceClient.cs) and [IServiceClientAsync](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/IServiceClientAsync.cs) interfaces, so they're easily swappable at runtime, which is what allows the same Unit test to be re-used within an [Xml, JSON, JSV, SOAP Integration
test](https://github.com/ServiceStack/ServiceStack/blob/master/tests/ServiceStack.WebHost.IntegrationTests/Tests/WebServicesTests.cs).

The JSON, XML and JSV clients also share [IRestClient](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/IRestClient.cs) and [IRestClientAsync](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/IRestClientAsync.cs).

## What's the best way to expose our services to clients today?

### Native language client libraries

A productive option for clients (and the recommended approach by ServiceStack) is to provide a native client library for each of the popular languages you wish to support. This is the approach of companies like Amazon, Facebook and Windows Azure, who really, really want to help you use their services.

This is an especially good idea if you want to support static languages (e.g. C# and Java) where having typed client libraries saves end-users from reverse engineering the types and API calls. It also saves them from having to look up documentation, since a lot of it can be inferred from the type info. ServiceStack's and Amazon's convention of having `ServiceName` and `ServiceNameResponse` for each service also saves users from continually checking documentation to work out what the response of each service will be.

### Packaging client libraries

In terms of packaging your client libraries, sticking a link to a zip file on your Website's API documentation page would be the easiest approach. If the zip file was a link to a master archive of a GitHub repository, that would be better, as you'll be able to accept bug fixes and usability tips from the community.

Finally, we believe the best way to make your client libraries available is to host them in the target language's native package manager - letting end-users issue one command to automatically add it to their project, and another to easily update it when your service has changed.
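As a sketch of how small such a typed client library can be — assuming hypothetical `GetWeather`/`GetWeatherResponse` DTOs and an illustrative base URL — the package only needs your DTO types plus the generic client from ServiceStack.Client:

```csharp
using System.Threading.Tasks;
using ServiceStack;

// Hypothetical DTOs following the ServiceName / ServiceNameResponse convention,
// so consumers can infer the response type without reading docs
public class GetWeather : IReturn<GetWeatherResponse>
{
    public string City { get; set; }
}

public class GetWeatherResponse
{
    public double TempC { get; set; }
}

public static class WeatherApi
{
    // Consumers call the API through the generic JsonServiceClient,
    // which resolves the response type from IReturn<T>
    public static Task<GetWeatherResponse> GetWeatherAsync(string city) =>
        new JsonServiceClient("https://api.example.org")
            .SendAsync(new GetWeather { City = city });
}
```

With this shape, publishing the library is just a matter of packaging the DTOs with a dependency on ServiceStack.Client.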
### Using NuGet

For .NET this means adding it to NuGet, and if you use ServiceStack your package would just need to contain your types with a reference to [ServiceStack.Client](http://nuget.org/packages/ServiceStack.Client).

One of the benefits of using ServiceStack is that all your types are already created, since it's what you used to define your web services with!

# Community Resources

- [ServiceStack and PHP](http://www.majorsilence.com/servicestack_and_php) by [@majorsilence](https://github.com/majorsilence)

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/locode/code-first.md

---
title: Locode - Code-First
---

Code-First is the natural development model of [AutoQuery Services](/autoquery/rdbms), where the majority of a System and its UI can be developed from simple, declarative C# POCOs defining the underlying RDBMS Schema Data Models and the precise typed API DTO Contracts of their surrounding AutoQuery & CRUD APIs.
The Data and Service models can be further enhanced by ServiceStack's vast [declarative attributes](/locode/declarative) where a significant amount of behavior, functionality and customization can be defined, ranging from: - Customizing how [Data Models map to RDBMS tables](/locode/declarative.html#table-data-model-attributes) and enlist RDBMS features - [Customize Serialization & API behavior](/locode/declarative.html#custom-serialization) - [Define AutoQuery & CRUD API behavior](/locode/declarative.html#autoquery-attributes) - Define [Validation Rules](/locode/declarative.html#type-validation-attributes) and [Authorization restrictions](/locode/declarative.html#authentication-restrictions) - [Annotate & Document APIs](/locode/declarative.html#annotate-apis) - [Customize UI Behavior & Appearance](/locode/declarative.html#result-formatters) To get started quickly, we've created a video containing a step-by-step walkthrough of creating a Code-First CRUD App with Locode: <div class="py-8 max-w-7xl mx-auto px-4 sm:px-6"> <lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="mFyMgg7c3vg" style="background-image: url('https://img.youtube.com/vi/mFyMgg7c3vg/maxresdefault.jpg')"></lite-youtube> </div> ## Bookings MVP A simple example of using Locode for a back office bookings system would be a single table that a staff member populates. ### Create your project To start off, we will create a project from the basic `web` template using the ServiceStack website. The link below will create a new project with the name "BookingsLocode". 
<div class="not-prose flex">
<a href="https://account.servicestack.net/archive/NetCoreTemplates/web?Name=BookingsLocode" class="text-xl hover:no-underline">
<div class="bg-white dark:bg-gray-800 px-4 py-4 mr-4 mb-4 rounded-lg shadow-lg text-center items-center justify-center hover:shadow-2xl dark:border-2 dark:border-pink-600 dark:hover:border-blue-600">
<div class="text-center font-extrabold flex items-center justify-center mb-2">
<div class="text-4xl text-blue-600 my-3">
<svg xmlns="http://www.w3.org/2000/svg" class="w-12 h-12" viewBox="0 0 24 24"><path fill="currentColor" d="M4 10.4V4a1 1 0 0 1 1-1h5V1h4v2h5a1 1 0 0 1 1 1v6.4l1.086.326a1 1 0 0 1 .682 1.2l-1.516 6.068A4.992 4.992 0 0 1 16 16a4.992 4.992 0 0 1-4 2a4.992 4.992 0 0 1-4-2a4.992 4.992 0 0 1-4.252 1.994l-1.516-6.068a1 1 0 0 1 .682-1.2L4 10.4zm2-.6L12 8l2.754.826l1.809.543L18 9.8V5H6v4.8zM4 20a5.978 5.978 0 0 0 4-1.528A5.978 5.978 0 0 0 12 20a5.978 5.978 0 0 0 4-1.528A5.978 5.978 0 0 0 20 20h2v2h-2a7.963 7.963 0 0 1-4-1.07A7.963 7.963 0 0 1 12 22a7.963 7.963 0 0 1-4-1.07A7.963 7.963 0 0 1 4 22H2v-2h2z"/></svg>
</div>
</div>
<span class="archive-name px-4 pb-2 text-blue-600 dark:text-indigo-400">BookingsLocode.zip</span>
<div class="count mt-1 text-gray-400 text-sm"></div>
</div>
</a>
</div>

The `web` template for a ServiceStack application provides the basic solution structure with a sample Hello World service. This can be done using the [ServiceStack website](https://servicestack.net) under [Get Started](https://servicestack.net/start).

Alternatively, templates can be created using the dotnet CLI tool `x`. The dotnet `x` tool can be installed using the following command:

:::sh
dotnet tool install --global x
:::

Once installed, a new `web` template can be created using:

:::sh
x new web MyProjectName
:::

### Mix in a database and AutoQuery

We can use the dotnet `x` tool to `mix` in specific database support and AutoQuery quickly using the following command run from the project directory.
:::sh
x mix sqlite autoquery
:::

::: tip
Replace `sqlite` with `postgres`, `sqlserver`, `mysql` or other RDBMS providers
:::

This command will create two files, `Configure.Db.cs` and `Configure.AutoQuery.cs`, and install the required NuGet dependencies into the AppHost (BookingsLocode in the link above) project.

### Bookings table

With our App now set up to use SQLite & AutoQuery, we'll define the `Booking` table where our data will be stored:

```csharp
public class Booking
{
    [AutoIncrement]
    public int Id { get; set; }
    public string Name { get; set; }
    public RoomType RoomType { get; set; }
    public int RoomNumber { get; set; }
    public DateTime BookingStartDate { get; set; }
    public DateTime? BookingEndDate { get; set; }
    public decimal Cost { get; set; }
    public string Notes { get; set; }
    public bool? Cancelled { get; set; }
}

public enum RoomType
{
    Single,
    Double,
    Queen,
    Twin,
    Suite,
}
```

With our table schema defined in code, we can use OrmLite to create the table for us if it doesn't already exist, which we do in the mix-generated `Configure.Db.cs` where our SQLite connection is defined, using `CreateTableIfNotExists()` to create the `Booking` table and populate it with Seed data when it's first created:

```csharp
public class ConfigureDb : IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureServices((context, services) => {
            services.AddSingleton<IDbConnectionFactory>(new OrmLiteConnectionFactory(
                context.Configuration.GetConnectionString("DefaultConnection") ??
                ":memory:", SqliteDialect.Provider));
        })
        // Create non-existing Table and add Seed Data Example
        .ConfigureAppHost(appHost => {
            using var db = appHost.Resolve<IDbConnectionFactory>().Open();
            if (db.CreateTableIfNotExists<Booking>())
            {
                // Seed data
                db.Insert(new Booking {
                    Name = "Test",
                    Cost = 123,
                    RoomNumber = 321,
                    RoomType = RoomType.Queen,
                    Notes = "Testing more",
                    BookingStartDate = new DateTime(2022, 1, 1),
                    BookingEndDate = new DateTime(2022, 1, 5)
                });
            }
        });
}
```

This configures our App's database ready for use, but we still haven't defined any AutoQuery APIs that use it.

## AutoQuery APIs

To create an [AutoQuery API](/autoquery/rdbms) to query our `Booking` RDBMS table, our Request DTO just needs to inherit `QueryDb<Table>` with the `Booking` table we want the API to query:

```csharp
public class QueryBookings : QueryDb<Booking> {}
```

This empty Request DTO alone is all it takes to create an AutoQuery API that can query each `Booking` column using any of the [Implicit Conventions](/autoquery/rdbms#implicit-conventions) registered in the `AutoQueryFeature` plugin, e.g:

- https://vue-spa.web-templates.io/api/QueryBookings?Ids=1,2,3

However, to aid in the discovery of popular Booking table queries and make them easily accessible to all of ServiceStack's [Typed Service Clients](/add-servicestack-reference) or [gRPC Clients](/grpc/), it's recommended to formalize the queries you want to make available by adding typed properties to the Request DTO, e.g:

```csharp
public class QueryBookings : QueryDb<Booking>
{
    public int[] Ids { get; set; }
    //...
}
```

Where they can also be consumed by every Service Client with an end-to-end Typed API, e.g:

```csharp
// C#
var client = new JsonApiClient("https://vue-spa.web-templates.io");
var api = await client.ApiAsync(new QueryBookings { Ids = new[] { 1, 2, 3 } });
```

TypeScript Example:

```ts
// TypeScript
let client = new JsonServiceClient("https://vue-spa.web-templates.io")
let api = await client.api(new QueryBookings({ Ids: [1,2,3] }))
```

### User-defined Routes

As AutoQuery APIs are themselves normal ServiceStack APIs, they benefit from the entire customizability and ecosystem available to ServiceStack APIs, like [Routing](/routing), where the API can be made available under custom user-defined routes with the `[Route]` attribute:

```csharp
[Route("/bookings")]
public class QueryBookings : QueryDb<Booking>
{
    public int[] Ids { get; set; }
}
```

This also makes the `QueryBookings` API available from the `/bookings` path, e.g:

- https://vue-spa.web-templates.io/bookings?Ids=1,2,3

### AutoQuery CRUD APIs

To enable Auto CRUD behavior on your Table, your Request DTOs can implement any of the following interfaces to create APIs with their respective CRUD behavior:

- `ICreateDb<Table>` - Insert a new Table row
- `IUpdateDb<Table>` - Fully update an existing Table row
- `IPatchDb<Table>` - Partially update an existing Table row
- `IDeleteDb<Table>` - Delete an existing Table row

The Create and Update Request DTOs' properties define which columns are updatable from the API:

```csharp
public class CreateBooking : ICreateDb<Booking>, IReturn<IdResponse>
{
    public string Name { get; set; }
    public RoomType RoomType { get; set; }
    public int RoomNumber { get; set; }
    public DateTime BookingStartDate { get; set; }
    public DateTime? BookingEndDate { get; set; }
    public decimal Cost { get; set; }
    public string Notes { get; set; }
}

public class UpdateBooking : IPatchDb<Booking>, IReturn<IdResponse>
{
    public int Id { get; set; }
    public string Name { get; set; }
    public RoomType?
    RoomType { get; set; }
    public int? RoomNumber { get; set; }
    public DateTime? BookingStartDate { get; set; }
    public DateTime? BookingEndDate { get; set; }
    public decimal? Cost { get; set; }
    public bool? Cancelled { get; set; }
    public string Notes { get; set; }
}
```

Only a single Update DTO needs to be implemented to enable Update functionality in Locode. If `IPatchDb<Table>` is implemented, Locode will use it to only update modified fields, whereas if only `IUpdateDb<Table>` is implemented, Locode needs to send all fields to perform a full update. If you have [AutoQuery CRUD Events](/autoquery/audit-log) enabled, it's recommended to use `IPatchDb<Table>` so the audit logs only capture which fields were updated.

To enable delete functionality in Locode, create a Request DTO that implements `IDeleteDb<Table>` with the primary key of the table:

```csharp
public class DeleteBooking : IDeleteDb<Booking>, IReturnVoid
{
    public int Id { get; set; }
}
```

Although not used by Locode, Delete APIs support the same querying behavior as AutoQuery APIs, where you could create an API that supports multiple and batch deletes with the fields you want to delete by, e.g:

```csharp
public class DeleteBookings : IDeleteDb<Booking>, IReturnVoid
{
    public int[]? Ids { get; set; }
    public bool? Cancelled { get; set; }
}
```

Locode is a capability-based UI that only enables functionality for CRUD APIs that exist and that the currently authenticated user has access to. As these public APIs don't have any auth restrictions applied to them, they can be used immediately by non-authenticated users without signing in to query, insert, update and delete from the `Booking` Table:

![](/img/pages/locode/code-first-bookings-mvp.png)

Clicking on our `Booking` service in the left-hand menu utilizes the `QueryBookings` AutoQuery API, where we can see the test seed data that was populated.
![](/img/pages/locode/code-first-bookings-mvp-2.png)

Using the **New Booking** button gives us a metadata-driven Form UI derived from the properties of the `CreateBooking` Request DTO:

![](/img/pages/locode/code-first-bookings-mvp-3.png)

This form also allows editing existing bookings using the **Edit** button in the first column, functionality that's enabled by the application having an `IPatchDb<Booking>` API defined.

![](/img/pages/locode/code-first-bookings-mvp-4.png)

## POCO References

When they can be inferred, Locode automatically detects and linkifies [POCO references](/ormlite/reference-support) for easy navigation, which is used a lot in https://talent.locode.dev, like navigating to a Job's [Job Applications](https://talent.locode.dev/locode/QueryJobApplication):

[![](/img/pages/locode/talent/job-application-references.png)](https://talent.locode.dev/locode/QueryJobApplication)

defined by its [POCO References](https://github.com/NetCoreApps/TalentBlazor/blob/ff6fd961f49141e617fef37b85240af04295359a/TalentBlazor.ServiceModel/Talent.cs#L87):

```csharp
public class JobApplication : AuditBase
{
    [AutoIncrement]
    public int Id { get; set; }

    [References(typeof(Job))]
    public int JobId { get; set; }

    [References(typeof(Contact))]
    public int ContactId { get; set; }

    [Reference]
    public Contact Applicant { get; set; }
    //...
}
```

### Navigating Child References

The references support also allows adding related records by navigating to the child relation then adding the child record, where it will preserve and pre-populate the parent id, as seen when navigating to a Job's Applications:

```csharp
public class Job : AuditBase
{
    //...
    public List<JobApplication> Applications { get; set; } = new();
}
```

Where it will pre-populate the **Job Id** reference, making it easy to add multiple **1:Many** Job Applications:

![](/img/pages/locode/talent/job-job-application-reference.png)

Check out the [Talent.cs](https://github.com/NetCoreApps/TalentBlazor/blob/main/TalentBlazor.ServiceModel/Talent.cs) DTOs for more Reference examples.

## Customizing Locode App

We've walked through a simple example of how to create CRUD APIs for our `Booking` RDBMS table, which Locode uses to power its instant CRUD UI, letting your App users start managing its data immediately.

This touches on some basic functionality to get started in Locode. Next, we'll explore its declarative dev model and the different ways you can annotate your services and data model to customize its behavior & appearance and enhance it with additional functionality using the available composable built-in [declarative C# attributes](/locode/declarative).

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/vue/combobox.md

---
title: Combobox Component
group: Component Gallery
---

The `Combobox` component provides an Autocomplete Input optimized for searching a List of string values, Key Value Pairs or Object Dictionary, e.g:

```html
<div class="grid grid-cols-12 gap-6">
  <Combobox id="Strings" class="col-span-4" v-model="strings"
      :values="['Alpha','Bravo','Charlie']" />
  <Combobox id="Object" class="col-span-4" v-model="objects"
      :options="{ A:'Alpha', B:'Bravo', C:'Charlie' }" />
  <Combobox id="Pairs" class="col-span-4" v-model="pairs" label="Multiple from Pairs" multiple
      :entries="[{key:'A',value:'Alpha'}, {key:'B',value:'Bravo'}, {key:'C',value:'Charlie'}]" />
</div>
```

<div class="not-prose grid grid-cols-12 gap-6">
<combobox id="Strings" class="col-span-4" v-model="strings" :values="['Alpha','Bravo','Charlie']"></combobox>
<combobox id="Object" class="col-span-4" v-model="objects" :options="{
A:'Alpha', B:'Bravo', C:'Charlie' }"></combobox> <combobox id="Pairs" class="col-span-4" v-model="pairs" label="Multiple from Pairs" multiple :entries="[{key:'A',value:'Alpha'}, {key:'B',value:'Bravo'}, {key:'C',value:'Charlie'}]"></combobox> </div> Which supports populating both a single string value or multiple strings in an Array with **multiple** property. <api-reference component="Combobox"></api-reference> ## Auto Forms Combobox components can also be used in [Auto Form Components](/vue/autoform) on `string` or string collection properties with the `[Input(Type="combobox")]` [declarative UI Attribute](/locode/declarative#ui-metadata-attributes) on C# Request DTOs, e.g: ```csharp public class ComboBoxExamples : IReturn<ComboBoxExamples>, IPost { [Input(Type="combobox", Options = "{ allowableValues:['Alpha','Bravo','Charlie'] }")] public string? SingleClientValues { get; set; } [Input(Type="combobox", Options = "{ allowableValues:['Alpha','Bravo','Charlie'] }", Multiple = true)] public List<string>? MultipleClientValues { get; set; } [Input(Type="combobox", EvalAllowableValues = "['Alpha','Bravo','Charlie']")] public string? SingleServerValues { get; set; } [Input(Type="combobox", EvalAllowableValues = "AppData.AlphaValues", Multiple = true)] public List<string>? MultipleServerValues { get; set; } [Input(Type="combobox", EvalAllowableEntries = "{ A:'Alpha', B:'Bravo', C:'Charlie' }")] public string? SingleServerEntries { get; set; } [Input(Type="combobox", EvalAllowableEntries = "AppData.AlphaDictionary", Multiple = true)] public List<string>? 
    MultipleServerEntries { get; set; }
}
```

Which can then be rendered with:

```html
<AutoForm type="ComboBoxExamples" />
```

<auto-form type="ComboBoxExamples" class="not-prose mb-4"></auto-form>

**Combobox Options**

Each property shows a different way of populating the Combobox's optional values. They can be populated from a JavaScript Object literal using `Options`, or on the server with a [#Script Expression](https://sharpscript.net) where they can be populated from a static list or from a C# class, as seen in the examples referencing `AppData` properties:

```csharp
public class AppData
{
    public List<string> AlphaValues { get; set; }
    public Dictionary<string, string> AlphaDictionary { get; set; }
    public List<KeyValuePair<string, string>> AlphaKeyValuePairs { get; set; }
}
```

Which are populated in the AppHost on Startup with:

```csharp
ScriptContext.Args[nameof(AppData)] = new AppData
{
    AlphaValues = new() { "Alpha", "Bravo", "Charlie" },
    AlphaDictionary = new()
    {
        ["A"] = "Alpha",
        ["B"] = "Bravo",
        ["C"] = "Charlie",
    },
    AlphaKeyValuePairs = new()
    {
        new("A","Alpha"),
        new("B","Bravo"),
        new("C","Charlie"),
    },
};
```

Which can alternatively be populated from a dynamic source like an RDBMS table. As C# Dictionaries have an undetermined sort order, you can use a `List<KeyValuePair<string, string>>` instead when you need to display an ordered list of Key/Value pairs.

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/ai-server/comfy-extension.md

---
title: ComfyUI Agent
description: Installing and configuring the ComfyUI Agent for AI Server
---

ComfyUI is a powerful workflow tool for various AI related tasks, including the ability to create images from text, images from images, and more. It is a key component of AI Server that provides a wide range of processing capabilities. One issue it has, though, is that it can be difficult to integrate with other systems.
The ComfyUI API consists of converting a JSON workflow definition to an API format with very specific requirements. As a way to leverage the ComfyUI API in a more accessible manner, we have created a [ComfyUI Agent](https://github.com/serviceStack/agent-comfy) repository so you can more easily use ComfyUI workflows and add it as a provider in AI Server. This allows you to integrate a ComfyUI Agent into your AI Server instance, using it as a remote self-hosted agent capable of processing image requests and other modalities.

Since a lot of AI workloads require GPUs or other specialized hardware, the ComfyUI Agent can be run on a separate machine with the necessary hardware, and AI Server can be configured to use it as a provider for these kinds of tasks.

## Installing the ComfyUI Agent

To install this more easily, you can use the `install.sh` script in the ComfyUI Agent repository. This works the same way as the AI Server installer and will prompt you for the necessary configuration options. This installer supports both local and remote installations, and will ask you for the necessary configuration options including the Auth Secret for your AI Server instance. The install process will then register the ComfyUI Agent with your AI Server instance, enabling it for the model selections you make during the installation.

```sh
git clone https://github.com/ServiceStack/agent-comfy.git
cd agent-comfy
cat install.sh | bash
```

This process will also persist the configuration in the `.env` file in the ComfyUI Agent directory, so you can easily restart the ComfyUI Agent with the same configuration.

<ascii-cinema src="/pages/ai-server/agent-comfy-install.cast" loop="true" poster="npt:00:09" theme="dracula" rows="13" />

:::info
On the first run, the ComfyUI Agent will download the models you selected during the installation process. This can take some time depending on the size of the models and your internet connection speed.
:::

### .env Configuration

The `.env` file is used to configure the ComfyUI Agent during the initial setup, and is the easiest way to get started.

```sh
DEFAULT_MODELS=sdxl-lightning,text-to-speech,speech-to-text,image-upscale-2x,image-to-text
HF_TOKEN=your_huggingface_token
AGENT_PASSWORD=password-to-restrict-access-to-agent
```

::: info
Models requiring authentication to download are also flagged in the `/lib/data/media-models.json` file of the AI Server GitHub repository.
:::

### Accessing the ComfyUI Agent

Once the ComfyUI Agent is running, you can access the ComfyUI Agent instance at [http://localhost:7860](http://localhost:7860), where it can be used as a standard ComfyUI instance. The AI Server has pre-defined workflows to interact with your ComfyUI Agent instance to generate images, audio, text, and more.

### Overriding Workflows

These workflows are found in the AI Server AppHost project under `workflows`. These are templated JSON versions of workflows you save in the ComfyUI web interface. You can override these workflows by creating a new JSON file with the same name and path but in the `App_Data/overrides` folder.

E.g. to override the `text_to_image` workflow, you would create a file at `App_Data/overrides/text_to_image.json`. This would override all calls that use the text-to-image workflow sent to your ComfyUI Agent instance.

You can also override just `flux-schnell` by creating a file at `App_Data/overrides/flux1/text_to_image.json` and Stable Diffusion 3.5 at `App_Data/overrides/sd35/text_to_image.json`.

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/commands.md

---
title: Commands Feature
---

### Utilizing Commands to build more robust and observable systems

Much of ServiceStack has been focused on providing a productive [API First Development](/api-first-development) experience and adding value-added features around your System's external APIs.
### Internal API Implementation

Thus far little attention has been given to the internal implementation of APIs since they can use anything that fulfils their service contract by returning the API's populated Response DTO.

How code-bases are structured is largely a matter of developer preference, however we believe we've also been able to add value in this area with the appealing new managed **Commands Feature**.

:::youtube SXPdBHbncPc
Use Commands to build robust and observable systems with Admin UI
:::

## Code Architecture

Ultimately nothing beats the simplicity of "No Architecture" by maintaining all logic within a Service Implementation which just needs to call a few App dependencies to implement its functionality and return a populated Response DTO:

```csharp
public object Any(MyRequest request) => new MyResponse { ... };
```

This is still the best option for small implementations where the Service is the only consumer of the logic that should be run on the HTTP Worker Request Thread.

#### When to restructure

Times when you may want to consider moving logic out of your Service into separate classes include:

- **Code Reuse**: Make it easier to reuse your Service logic in other Services
- **Complexity**: Break down complex logic into smaller more manageable pieces
- **Testability**: Make it easier to test your Logic in isolation
- **Observability**: Make it easier to log and monitor
- **Robustness**: Make it easier to handle, retry and recover from errors
- **Flexibility**: Make it easier to run in parallel or in a different managed thread

We'll look at how the new **Commands Feature** can help in these areas.

### Code Reuse

Following principles of YAGNI in doing the simplest thing that could possibly work, whenever we want to reuse logic across Services we'd first start by moving it to an extension method on the dependency that it uses, e.g.
```csharp
public static async Task<List<Contact>> GetActiveSubscribersAsync(
    this IDbConnection db, MailingList mailingList)
{
    return await db.SelectAsync(db.From<Contact>(db.TableAlias("c"))
        .Where(x => x.DeletedDate == null && x.UnsubscribedDate == null
            && x.VerifiedDate != null && (mailingList & x.MailingLists) == mailingList)
        .WhereNotExists(db.From<InvalidEmail>()
            .Where<Contact,InvalidEmail>((c,e) => e.EmailLower == Sql.TableAlias(c.EmailLower, "c"))
            .Select(x => x.Id))
    );
}
```

Which does a great job at encapsulating logic and making it reusable and readable:

```csharp
foreach (var sub in await Db.GetActiveSubscribersAsync(MailingList.Newsletter))
{
    //...
}
```

Where it can be reused without referencing any external classes whilst also being easily discoverable via intelli-sense. This works great for 1 or 2 dependencies, but becomes more cumbersome as the number of dependencies grows, e.g:

```csharp
public static async Task<List<Contact>> GetActiveSubscribersAsync(
    this IDbConnection db, ILogger log, ICacheClient cache, MailingList mailingList)
```

Here the complexity of the extension method's dependencies leaks into and impacts all calling classes that need to provide them, which also starts to hurt its readability, e.g:

```csharp
public class MyService(ILogger<MyService> log, ICacheClient cache, IDbConnection db)
    : Service
{
    public async Task Any(MyRequest request)
    {
        var subs = await db.GetActiveSubscribersAsync(log, cache, request.MailList);
    }
}
```

### Refactoring Logic into separate classes

The solution to this is to refactor the logic into a separate class and leverage the IOC to inject the dependencies it needs, fortunately with Primary Constructors this now requires minimal boilerplate code, e.g:

```csharp
class MyLogic(ILogger<MyLogic> log, ICacheClient cache, IDbConnection db)
{
    //...
}
```

But it still requires manual registration, adding additional complexity to your Host project's `Program.cs` or [Modular Configurations](/modular-startup) which needs to manage the registration of all these new logic classes, e.g:

```csharp
builder.Services.AddTransient<MyLogic>();
```

## Commands Feature

Which touches on the first benefit of the **Commands Feature** which, like ServiceStack Services, auto-registers all classes implementing the intentionally simple and impl-free `IAsyncCommand` interface, e.g:

```csharp
public interface IAsyncCommand<in T>
{
    Task ExecuteAsync(T request);
}
```

Allowing for maximum flexibility in how to implement your logic classes, which are essentially encapsulated units of logic with a single method to execute it, e.g:

```csharp
public class AddTodoCommand(ILogger<AddTodoCommand> log, IDbConnection db)
    : IAsyncCommand<CreateTodo>
{
    public async Task ExecuteAsync(CreateTodo request)
    {
        var newTodo = request.ConvertTo<Todo>();
        newTodo.Id = await db.InsertAsync(newTodo, selectIdentity:true);
        log.LogDebug("Created Todo {Id}: {Text}", newTodo.Id, newTodo.Text);
    }
}
```

Where we immediately get the benefits of code reuse, encapsulation, and readability without needing to manually register and pollute your App's configuration with them.
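Concretely, the auto-registration above only requires registering the plugin itself, which scans your Service Assemblies for `IAsyncCommand` implementations. A minimal sketch, e.g. in a [Modular Startup](/modular-startup) class (the `ConfigureCommands` class name here is illustrative):

```csharp
// Registering the CommandsFeature plugin auto-registers every
// IAsyncCommand implementation, removing the need for a
// per-class builder.Services.AddTransient<...>() registration
public class ConfigureCommands : IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureServices(services =>
            services.AddPlugin(new CommandsFeature()));
}
```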
By default Commands are registered as transient dependencies, but you can also register them with a different lifetime scope using the `[Lifetime]` attribute, e.g:

```csharp
[Lifetime(Lifetime.Scoped)]
public class AddTodoCommand(ILogger<AddTodoCommand> log, IDbConnection db)
    : IAsyncCommand<CreateTodo> {}
```

Or by manually registering them, if you need a custom registration:

```csharp
services.AddTransient<AddTodoCommand>(c => CreateAddTodoCommand(c));
```

### Commands with Results

For maximum flexibility, we want to encourage temporal decoupling by separating initiating a command from its execution, so instead of adding a different method to execute commands with results, we're instead recommending the convention of storing the result of a command in a `Result` property, e.g:

```csharp
public interface IAsyncCommand<in TRequest, out TResult>
    : IAsyncCommand<TRequest>, IHasResult<TResult>
{
}

public interface IHasResult<out T>
{
    T Result { get; }
}
```

So we could implement a command with a result like:

```csharp
public class AddTodoCommand(ILogger<AddTodoCommand> log, IDbConnection db)
    : IAsyncCommand<CreateTodo, Todo>
{
    public Todo Result { get; private set; }

    public async Task ExecuteAsync(CreateTodo request)
    {
        Result = request.ConvertTo<Todo>();
        Result.Id = await db.InsertAsync(Result, selectIdentity:true);
        log.LogDebug("Created Todo {Id}: {Text}", Result.Id, Result.Text);
    }
}
```

### Ergonomic Base Classes

Often you'll also need to make additional Request Context available to the command that's not a part of the Command Request or registered in the IOC, like an Authenticated User Context or `CancellationToken`.
::include command-types.md::

## Messaging Workflow

For greater resilience and scalability we recommend utilizing a messaging pattern to notify the outputs of a command by publishing messages to invoke dependent logic instead of returning a result, e.g:

### Background Jobs

```csharp
public class AddTodoCommand(IDbConnection db, IBackgroundJobs jobs)
    : SyncCommand<MyArgs>
{
    protected override void Run(MyArgs request)
    {
        var newTodo = request.ConvertTo<Todo>();
        newTodo.Id = db.Insert(newTodo, selectIdentity:true);

        // Non Durable Example
        jobs.RunCommand<SendNotificationCommand>(
            new SendNotification { TodoCreated = newTodo });

        // Durable Example
        jobs.EnqueueCommand<SendNotificationCommand>(
            new SendNotification { TodoCreated = newTodo });
    }
}
```

### Background MQ

```csharp
public class AddTodoCommand(IDbConnection db, IMessageProducer mq)
    : SyncCommand<MyArgs>
{
    protected override void Run(MyArgs request)
    {
        var newTodo = request.ConvertTo<Todo>();
        newTodo.Id = db.Insert(newTodo, selectIdentity:true);
        mq.Publish(new SendNotification { TodoCreated = newTodo });
    }
}
```

Which decouples the sender and receiver of the message, allowing it to finish without needing to wait or concern itself with how subsequent logic is processed, e.g. how to handle errors, whether to execute it in a different managed thread, in parallel, etc.

Messaging encourages adopting a more reliable asynchronous one-way workflow instead of implementing logic serially, where the sender is temporally coupled to the successful execution of all subsequent logic before being able to complete, e.g:

```csharp
await cmd.ExecuteAsync(createTodo);
var newTodo = cmd.Result;
await SendNewTodoNotificationAsync(newTodo);
```

It allows for more reliable and observable workflows that remove the temporal coupling between components, where each execution step can be executed on different threads, independently monitored and retried if needed.
```txt
[A] -> [B] -> [C]
```

### Commands as Application Building Blocks

As they're not dependent on any framework and can support multiple execution patterns, we believe Commands make great building blocks for insulating units of logic as they're simple and testable and allow for managed execution which can easily add logging, monitoring, and resilience around your logic.

### Background Jobs or MQ

It should be noted adopting a messaging pattern doesn't require the additional infrastructure complexity of an external MQ Server as you can use [Background Jobs](/background-jobs) or [Background MQ](/background-mq) to execute messages in managed background threads.

### Executing Commands

Commands are effectively a pattern to structure your logic that doesn't depend on any implementation assembly or framework, so they can just be executed directly, e.g:

```csharp
using var db = dbFactory.Open();
var cmd = new AddTodoCommand(new NullLogger<AddTodoCommand>(), db);
await cmd.ExecuteAsync(new CreateTodo { Text = "New Todo" });
```

### Command Executor

They also allow for a managed execution which the **CommandsFeature** provides with its `ICommandExecutor`, which can be executed like:

```csharp
public class MyService(ICommandExecutor executor) : Service
{
    public async Task Any(MyRequest request)
    {
        var cmd = executor.Command<AddTodoCommand>();
        await cmd.ExecuteAsync(new CreateTodo { Text = "New Todo" });
    }
}
```

This still results in the same behavior where exceptions are bubbled, but also adds observability and resilience and other niceties like executing any Fluent or Declarative Validation on Command Requests.
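Because Commands are plain classes, the direct execution pattern above also makes them easy to unit test without spinning up an AppHost. A minimal sketch, assuming NUnit, an in-memory SQLite `OrmLiteConnectionFactory` and the `AddTodoCommand` from earlier (the test name and setup are illustrative, not from the source docs):

```csharp
[Test]
public async Task AddTodoCommand_creates_a_new_Todo()
{
    // In-memory SQLite db containing the Todo table the command writes to
    var dbFactory = new OrmLiteConnectionFactory(":memory:", SqliteDialect.Provider);
    using var db = dbFactory.Open();
    db.CreateTable<Todo>();

    var cmd = new AddTodoCommand(new NullLogger<AddTodoCommand>(), db);
    await cmd.ExecuteAsync(new CreateTodo { Text = "New Todo" });

    var todos = db.Select<Todo>();
    Assert.That(todos.Count, Is.EqualTo(1));
    Assert.That(todos[0].Text, Is.EqualTo("New Todo"));
}
```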
### Retry Failed Commands We can make commands more resilient by adding the `[Retry]` attribute to opt into auto retrying failed commands: ```csharp [Retry] public class AddTodoCommand() : IAsyncCommand<CreateTodo> {} ``` Which will automatically retry the command as per the default Retry Policy: ```csharp services.AddPlugin(new CommandsFeature { DefaultRetryPolicy = new( Times: 3, Behavior: RetryBehavior.FullJitterBackoff, DelayMs: 100, MaxDelayMs: 60_000, DelayFirst: false ) }); ``` That can be overridden on a per-command basis with the `[Retry]` attribute, e.g: ```csharp [Retry(Times=4, MaxDelayMs=300_000, Behavior=RetryBehavior.LinearBackoff)] public class AddTodoCommand() : IAsyncCommand<CreateTodo> {} ``` The different Retry Behaviors available include: ```csharp public enum RetryBehavior { // Use the default retry behavior Default, // Always retry the operation after the same delay Standard, // Should be retried with a linear backoff delay strategy LinearBackoff, // Should be retried with an exponential backoff strategy ExponentialBackoff, // Should be retried with a full jittered exponential backoff strategy FullJitterBackoff, } ``` ## Command Admin UI Which can be inspected in the new **Command Admin UI** where you can view summary stats of all executed Commands and **APIs** in the **Summary** tab, e.g: [![](/img/pages/commands/AddTodoCommand-summary.png)](/img/pages/commands/AddTodoCommand-summary.png) ### Latest Command Executions It also maintains a rolling log of the latest executed commands in the **Latest** tab: [![](/img/pages/commands/AddTodoCommand-latest.png)](/img/pages/commands/AddTodoCommand-latest.png) ### Failed Command Executions Whilst the **Errors** tab shows a list of all failed **Command** and **API** executions: [![](/img/pages/commands/AddTodoCommand-errors.png)](/img/pages/commands/AddTodoCommand-errors.png) ### Execute Internal Commands A benefit of using Commands as the building block for your internal logic is that they enjoy many 
of the same benefits of ServiceStack's message-based Services where they can be invoked using just the Command **Name** and a **Request** Body which allows them to be discovered and executed from the **Explore** Tab: [![](/img/pages/commands/AddTodoCommand-execute.png)](/img/pages/commands/AddTodoCommand-execute.png) In this way they can be treated like **Internal APIs** for being able to invoke internal functionality that's only accessible by **Admin** Users. ### Group Commands by Tag Just like ServiceStack Services they can be grouped by **Tag** which can be used to group related commands: ```csharp [Tag("Todos")] public class AddTodoCommand() : IAsyncCommand<CreateTodo> {} ``` ## Execute Commands in Durable Background Jobs In addition to being able to execute **Commands** with the `ICommandExecutor` or from the UI, they can also be executed as part of a [Durable Background Job](/background-jobs) where you'll be able to track and monitor their progress in real-time. Background Jobs is already configured in all new [Identity Auth Templates](https://servicestack.net/start) in order to send all Identity Auth Emails. 
Whilst existing Projects can enable it in their .NET 10 Apps with:

:::sh
npx add-in jobs
:::

Which adds a reference to the [ServiceStack.Jobs](https://www.nuget.org/packages/ServiceStack.Jobs) NuGet package and includes the [Modular Startup](/modular-startup) configuration below:

```csharp
public class ConfigureBackgroundJobs : IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureServices(services => {
            services.AddPlugin(new CommandsFeature());
            services.AddPlugin(new BackgroundsJobFeature());
            services.AddHostedService<JobsHostedService>();
        }).ConfigureAppHost(afterAppHostInit: appHost => {
            var services = appHost.GetApplicationServices();
            var jobs = services.GetRequiredService<IBackgroundJobs>();
            // Example of registering a Recurring Job to run Every Hour
            //jobs.RecurringCommand<MyCommand>(Schedule.Hourly);
        });
}

public class JobsHostedService(ILogger<JobsHostedService> log, IBackgroundJobs jobs)
    : BackgroundService
{
    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        await jobs.StartAsync(stoppingToken);

        using var timer = new PeriodicTimer(TimeSpan.FromSeconds(3));
        while (!stoppingToken.IsCancellationRequested &&
               await timer.WaitForNextTickAsync(stoppingToken))
        {
            await jobs.TickAsync();
        }
    }
}
```

## Background MQ Integration

Although `CommandsFeature` is a standalone feature, it can also be configured and used along with [Background MQ](/background-mq), which is a good option if you intend on adopting another [Message Queue Broker](/messaging) in your App in future.
```csharp
public class ConfigureMq : IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureServices((context, services) => {
            services.AddSingleton<IMessageService>(c => new BackgroundMqService());
            services.AddPlugin(new CommandsFeature());
        })
        .ConfigureAppHost(afterAppHostInit: appHost => {
            var mqService = appHost.Resolve<IMessageService>();

            //Register ServiceStack APIs you want to be able to invoke via MQ
            mqService.RegisterHandler<SendEmail>(appHost.ExecuteMessage);
            mqService.Start();
        });
}
```

Despite being 2 independent features, they work well together as the Background MQ can be used to execute Commands in managed background threads, where a single thread is used to execute each Request Type by default (configurable per request).

You'd typically want to use queues to improve scalability by reducing locking and concurrency contention of heavy resources by having requests queued and executed in a managed background thread, where they can be executed as fast as possible without contention. Queues are also a great solution for working around single thread limitations of resources like writes to SQLite databases.

## Use Case - SQLite Writes

As we've started to [use server-side SQLite databases](/ormlite/scalable-sqlite) for our new Apps given its [many benefits](/ormlite/litestream), we needed a solution to work around its limitation of not being able to handle multiple writes concurrently.

One of the benefits of using SQLite is that creating and managing [multiple databases](/ormlite/scalable-sqlite#multiple-sqlite-databases) is relatively cheap, so we can mitigate this limitation somewhat by maintaining different subsystems in separate databases, e.g:

[![](/img/pages/commands/pvq-databases.png)](/img/pages/commands/pvq-databases.png)

But each database can only be written to by a single thread at a time, which we can now easily facilitate with **Background Jobs** or **MQ Command DTOs**.
In all cases we recommend using [Sync DB APIs for SQLite](/ormlite/scalable-sqlite#always-use-synchronous-apis-for-sqlite) since their underlying implementation always blocks.

### Queuing DB Writes with SyncCommand Background Jobs

One way to remove contention is to serially execute DB Writes, which we can do by executing DB Writes within `SyncCommand*` and using a named `[Worker(Workers.AppDb)]` attribute for Writes to the primary database, e.g:

```csharp
[Worker(Workers.AppDb)]
public class DeleteCreativeCommand(IDbConnection db)
    : SyncCommand<DeleteCreative>
{
    protected override void Run(DeleteCreative request)
    {
        var artifactIds = request.ArtifactIds;
        db.Delete<AlbumArtifact>(x => artifactIds.Contains(x.ArtifactId));
        db.Delete<ArtifactReport>(x => artifactIds.Contains(x.ArtifactId));
        db.Delete<ArtifactLike>(x => artifactIds.Contains(x.ArtifactId));
        db.Delete<Artifact>(x => x.CreativeId == request.Id);
        db.Delete<CreativeArtist>(x => x.CreativeId == request.Id);
        db.Delete<CreativeModifier>(x => x.CreativeId == request.Id);
        db.Delete<Creative>(x => x.Id == request.Id);
    }
}
```

Other databases should use their named connection with their own named worker, e.g:

```csharp
[Worker(Databases.Search)]
public class DeleteSearchCommand(IDbConnectionFactory dbFactory)
    : SyncCommand<DeleteSearch>
{
    protected override void Run(DeleteSearch request)
    {
        using var db = dbFactory.Open(Databases.Search);
        db.DeleteById<ArtifactFts>(request.Id);
        //...
    }
}
```

Example of a DB Write command with result:

```csharp
[Worker(Databases.Albums)]
public class CreateAlbumCommand(IDbConnectionFactory dbFactory)
    : SyncCommandWithResult<CreateAlbum,Album>
{
    protected override Album Run(CreateAlbum request)
    {
        using var db = dbFactory.Open(Databases.Albums);
        var album = request.ConvertTo<Album>();
        album.Id = db.Insert(album, selectIdentity:true);
        foreach (var artifact in request.Artifacts)
        {
            artifact.AlbumId = album.Id;
            db.Insert(artifact);
        }
        return album;
    }
}
```

Where it will be executed within its Database Lock.

### Running Commands

You'll typically want to run DB Write Commands with the `RunCommand*` APIs, which are a faster and lighter weight alternative to durable jobs, which are persisted in **jobs.db** before execution.

Every time commands are executed they'll be added to a ConcurrentQueue of the specified worker. Commands delegated to different named workers execute concurrently, whilst commands with the same worker are executed serially. When using any `SyncCommand*` base class, its execution still uses database locks, but any contention is alleviated as they're executed serially by a single worker thread.

```csharp
public class MyServices(IBackgroundJobs jobs) : Service
{
    // Returns immediately with a reference to the Background Job
    public object Any(DeleteCreative request)
    {
        // Queues a durable job to execute the command with the AppDb Worker
        var jobRef = jobs.EnqueueCommand<DeleteCreativeCommand>(request);
        // Executes Command with Databases.Search worker
        jobs.EnqueueCommand<DeleteSearchCommand>(new DeleteSearch {
            Id = request.ArtifactId
        });
        return jobRef;
    }

    // Returns after the command is executed with its result (if any)
    public async Task<object> Any(CreateAlbum request)
    {
        // Executes a transient (i.e. non-durable) job with the named worker
        var album = await jobs.RunCommandAsync<CreateAlbumCommand>(request);
        return album;
    }
}
```

### MQ Command DTOs

If using **Background MQ** we can use the `[Command]` attribute to execute multiple commands using a single Request DTO's properties:

```csharp
[Tag(Tag.Tasks)]
[Restrict(RequestAttributes.MessageQueue), ExcludeMetadata]
public class DbWrites : IGet, IReturn<EmptyResponse>
{
    [Command<CreatePostVoteCommand>]
    public Vote? CreatePostVote { get; set; }

    [Command<CreateCommentVoteCommand>]
    public Vote? CreateCommentVote { get; set; }

    [Command<CreatePostCommand>]
    public Post? CreatePost { get; set; }

    [Command<UpdatePostCommand>]
    public Post? UpdatePost { get; set; }

    [Command<DeletePostsCommand>]
    public DeletePosts? DeletePosts { get; set; }

    [Command<DeleteAnswersCommand>]
    public DeleteAnswers? DeleteAnswers { get; set; }

    [Command<CreateAnswerCommand>]
    public Post? CreateAnswer { get; set; }

    [Command<PostSubscriptionsCommand>]
    public PostSubscriptions? PostSubscriptions { get; set; }

    [Command<TagSubscriptionsCommand>]
    public TagSubscriptions? TagSubscriptions { get; set; }

    //...
}
```

Then to execute the commands we can use the `Request.ExecuteCommandsAsync` extension method for its Background MQ API implementation:

```csharp
public class BackgroundMqServices : Service
{
    public Task Any(DbWrites request) => Request.ExecuteCommandsAsync(request);
}
```

Which goes through all Request DTO properties and executes each populated property with its associated command, using it as the request for the command.
So after registering the `DbWrites` Command DTO with the MQ Service:

```csharp
mqService.RegisterHandler<DbWrites>(appHost.ExecuteMessage);
```

We can now publish a single `DbWrites` message to execute multiple commands in a single managed background thread:

```csharp
public class NotificationServices(IMessageProducer mq) : Service
{
    public void Any(Watch request)
    {
        var userName = Request.GetClaimsPrincipal().GetUserName();

        mq.Publish(new DbWrites {
            PostSubscriptions = request.PostId == null ? null : new() {
                UserName = userName,
                Subscriptions = [request.PostId.Value],
            },
            TagSubscriptions = request.Tag == null ? null : new() {
                UserName = userName,
                Subscriptions = [request.Tag],
            },
        });

        mq.Publish(new AnalyticsTasks {
            WatchRequest = request,
        });
    }
}
```

We also benefit from its natural parallelism where write requests to different Databases are executed in parallel.

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/commonjs-add-servicestack-reference.md

---
title: ES3 Common.js Add ServiceStack Reference
---

In addition to [TypeScript](/typescript-add-servicestack-reference) support for generating typed Data Transfer Objects (DTOs), JavaScript is now supported.
Unlike TypeScript, JavaScript generated DTOs can be used directly from the browser, removing the need to keep your DTOs in sync with extra tooling, by including a direct reference in your HTML Page:

```html
<script src="/types/js"></script>
```

To make typed API Requests from web pages, you need only include:

- **/js/require.js** - containing a simple `require()` to load **CommonJS** libraries
- **/js/servicestack-client.js** - [built-in UMD @servicestack/client](/servicestack-client-umd) in **ServiceStack.dll**
- **/types/js** - containing your API's typed JS DTOs

all of which are built into ServiceStack. After which you'll have access to the generic `JsonServiceClient` with your API's Typed Request DTOs, e.g:

```html
<script src="/js/require.js"></script>
<script src="/js/servicestack-client.js"></script>
<script src="/types/js"></script>
<script>
var { JsonServiceClient, Hello } = exports

var client = new JsonServiceClient()
client.api(new Hello({ name }))
    .then(api => console.log(api.response))
</script>
```

Using **/types/js** has the same behavior as using `dtos.js` generated from `$ tsc dtos.ts`, whose outputs are identical, i.e. both contain your API DTOs generated in CommonJS format. It's feasible to simulate the TypeScript compiler's output in this instance as ServiceStack only needs to generate DTO Types and Enums to enable its end-to-end API, and not any of the rest of TypeScript's vast feature set.

### Enhanced Dev Time productivity with TypeScript

Even when no longer using TypeScript DTOs in your Apps, it's still useful to have TypeScript's `dtos.ts` included in your project (inc. Vanilla JS projects) to serve as optional type annotations enabling rich intelli-sense and static analysis in IDEs that support it, but as it's no longer used at runtime you're free to generate it at optimal times that don't interrupt your dev workflow.
## DTO Customization Options

In most cases you'll just use the generated JavaScript DTOs as-is, however you can further customize how the DTOs are generated by overriding the default options.

The header in the generated DTOs shows the different options JavaScript types support with their defaults. Options that are preceded by the comment prefix `//` are defaults from the server. To override a value, remove the `//` and specify the value to the right of the `:` — any uncommented value will be sent to the server to override any server defaults.

```js
/* Options:
Date: 2022-01-28 02:10:26
Version: 6.00
Tip: To override a DTO option, remove "//" prefix before updating
BaseUrl: https://vue-static.web-templates.io

//AddServiceStackTypes: True
//AddDescriptionAsComments: True
//IncludeTypes: 
//ExcludeTypes: 
//DefaultImports: 
*/
```

### Change Default Server Configuration

The above defaults are also overridable on the ServiceStack Server by modifying the default config on the `NativeTypesFeature` Plugin, e.g:

```csharp
//Server example in CSharp
var nativeTypes = this.GetPlugin<NativeTypesFeature>();
nativeTypes.MetadataTypesConfig.IgnoreTypesInNamespaces = "test";
...
```

We'll go through and cover each of the above options to see how they affect the generated DTOs:

### IncludeTypes

Is used as a Whitelist to specify only the types you would like to have code-generated:

```
/* Options:
IncludeTypes: Hello, HelloResponse
```

Will only generate `Hello` and `HelloResponse` DTOs:

```csharp
var HelloResponse = /** @class */ (function () {
    ...
}()); exports.HelloResponse = HelloResponse; var Hello = /** @class */ (function () { ... }()); exports.Hello = Hello; ``` #### Include Generic Types Use .NET's Type Name to include Generic Types, i.e. the Type name separated by the backtick followed by the number of generic arguments, e.g: ``` IncludeTypes: IReturn`1,MyPair`2 ``` #### Include Request DTO and its dependent types You can include a Request DTO and all its dependent types with a `.*` suffix on the Request DTO, e.g: ``` /* Options: IncludeTypes: GetTechnology.* ``` Which will include the `GetTechnology` Request DTO, the `GetTechnologyResponse` Response DTO and all Types that they both reference. #### Include All Types within a C# namespace If your DTOs are grouped into different namespaces they can be all included using the `/*` suffix, e.g: ``` /* Options: IncludeTypes: MyApp.ServiceModel.Admin/* ``` This will include all DTOs within the `MyApp.ServiceModel.Admin` C# namespace. #### Include All Services in a Tag Group Services [grouped by Tag](/api-design#group-services-by-tag) can be used in the `IncludeTypes` where tags can be specified using braces in the format `{tag}` or `{tag1,tag2,tag3}`, e.g: ``` /* Options: IncludeTypes: {web,mobile} ``` Or individually: ``` /* Options: IncludeTypes: {web},{mobile} ``` ### ExcludeTypes Is used as a Blacklist to specify which types you would like excluded from being generated: ``` /* Options: ExcludeTypes: GetTechnology,GetTechnologyResponse ``` Will exclude `GetTechnology` and `GetTechnologyResponse` DTOs from being generated. ### Cache When using `/types/js` directly from a `script` tag, the server will cache the result by default when not running in `HostContext.DebugMode`. This caching process can be disabled if required by using `?cache=false`. 
---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/compiled-razor-views.md

---
slug: compiled-razor-views
title: Compiled Razor Views
---

The primary benefit of compiled views is improved performance by eliminating the compile times of Razor views. They can also provide static compilation benefits by highlighting compile errors during development, and can simplify deployment by avoiding the need to deploy any `*.cshtml` files as they end up pre-compiled in the containing Assembly.

### Install ServiceStack.Razor.BuildTask

To enable compiled razor views you need to add the [ServiceStack.Razor.BuildTask](https://www.nuget.org/packages/ServiceStack.Razor.BuildTask) NuGet Package to the project containing your Razor `*.cshtml` pages, i.e:

:::copy
`<PackageReference Include="ServiceStack.Razor.BuildTask" Version="5.*" />`
:::

This doesn't add any additional dlls to your project, instead it just sets the **BuildAction** of all `*.cshtml` pages to **Content** and registers an MSBuild task in your `.csproj` project file to pre-compile razor views on every build.

### Register Compiled Assembly to RazorFormat Plugin

To register assemblies containing compiled razor views with the Razor Format you just need to add them to `RazorFormat.LoadFromAssemblies`, e.g:

```csharp
Plugins.Add(new RazorFormat {
    LoadFromAssemblies = { typeof(RockstarsService).Assembly }
});
```

### Retains optimal development workflow

The Compiled Views support continues to retain a great development experience in [DebugMode](/debugging#debugmode) as all Razor Views are initially loaded from the Assembly, but it also continues to monitor the file system for modified views, automatically compiling and loading them on the fly so AppDomain reloads aren't required to see changes.
## Example Projects ### [Razor Rockstars](https://github.com/ServiceStackApps/RazorRockstars) The [RazorRockstars.CompiledViews](https://github.com/ServiceStackApps/RazorRockstars/tree/master/src/RazorRockstars.CompiledViews) VS.NET project shows an example of [Razor Rockstars](https://razor.netcore.io/) which uses shared compiled Razor Views in a `.dll` in a number of different projects: - [WPF Host](https://github.com/ServiceStackApps/RazorRockstars/tree/master/src/RazorRockstars.CompiledViews.WpfHost) - [HttpListener SelfHost](https://github.com/ServiceStackApps/RazorRockstars/tree/master/src/RazorRockstars.CompiledViews.SelfHost) - [ASP.NET WebHost](https://github.com/ServiceStackApps/RazorRockstars/tree/master/src/RazorRockstars.CompiledViews.WebHost) ### [ServiceStack.Gap](https://github.com/ServiceStack/ServiceStack.Gap) The [ServiceStack.Gap](https://github.com/ServiceStack/ServiceStack.Gap) project shows how to extend Compiled Razor Views and use them to create embedded ServiceStack solutions that can be ILMerged down to a single `.exe`. 
--- # Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/ormlite/complex-type-serializers.md --- title: Pluggable Complex Type Serializers --- Pluggable serialization lets you specify different serialization strategies of Complex Types for each available RDBMS provider, e.g: ## Examples ```csharp //ServiceStack's JSON and JSV Format SqliteDialect.Provider.StringSerializer = new JsvStringSerializer(); PostgreSqlDialect.Provider.StringSerializer = new JsonStringSerializer(); //.NET's XML and JSON DataContract serializers SqlServerDialect.Provider.StringSerializer = new DataContractSerializer(); MySqlDialect.Provider.StringSerializer = new JsonDataContractSerializer(); //.NET XmlSerializer OracleDialect.Provider.StringSerializer = new XmlSerializableSerializer(); ``` You can also provide a custom serialization strategy by implementing [IStringSerializer](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack.Text/src/ServiceStack.Text/IStringSerializer.cs). By default, all dialects use the existing `JsvStringSerializer`, except for PostgreSQL which, due to its built-in support for JSON, uses the JSON format by default. --- # Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/compression.md --- title: Compression --- ## Client/Server Request Compression In addition to [optimized cached Server Responses](/http-caching#server-caching) you can also elect to compress HTTP Requests in any C#/.NET Service Clients by specifying the Compression Type you wish to use, e.g: ```csharp var client = new JsonServiceClient(baseUrl) { RequestCompressionType = CompressionTypes.GZip, }; var client = new JsonHttpClient(baseUrl) { RequestCompressionType = CompressionTypes.Deflate, }; var response = client.Post(new Request { ... }); ``` Where sending any HTTP Request containing a Request Body (e.g. 
POST/PUT) will send a compressed Request body to the Server where it's transparently decompressed and deserialized into your Request DTO. ## `[CompressResponse]` Attribute You can selectively choose which Services should be compressed with the `[CompressResponse]` attribute, which compresses responses for clients that support compression and can be applied to most Response Types, e.g: ```csharp [CompressResponse] public class CompressedServices : Service { public object Any(CompressDto request) => new CompressExamplesResponse(); public object Any(CompressString request) => "foo"; public object Any(CompressBytes request) => "foo".ToUtf8Bytes(); public object Any(CompressStream request) => new MemoryStream("foo".ToUtf8Bytes()); public object Any(CompressFile request) => new HttpResult(VirtualFileSources.GetFile("/foo")); public object Any(CompressAnyHttpResult request) { return new HttpResult(new CompressExamplesResponse()); // DTO return new HttpResult("foo", "text/plain"); // string return new HttpResult("foo".ToUtf8Bytes(), "text/plain"); // bytes //etc } } ``` ::: info using `[CompressResponse]` is unnecessary when returning [cached responses](/http-caching) as ServiceStack automatically caches and returns the most optimal Response - typically compressed bytes for clients that support compression ::: ## Static File Compression ServiceStack can also be configured to compress static files with specific file extensions that are larger than a specific size with the opt-in Config options below: ```csharp SetConfig(new HostConfig { CompressFilesWithExtensions = { "js", "css" }, // (optional), only compress .js or .css files > 10k CompressFilesLargerThanBytes = 10 * 1024 }); ``` When more fine-grained logic is needed you can override `ShouldCompressFile()` in your AppHost to choose which static files you want to compress on a per-file basis, e.g: ```csharp public override bool ShouldCompressFile(IVirtualFile file) { return 
base.ShouldCompressFile(file) || file.Name == "large.csv"; } ``` #### When to enable Static File Compression It's more optimal to configure static file compression on the native Web Server that's hosting your ServiceStack App than in managed code. You can use [Fiddler](http://www.telerik.com/fiddler) to check if your Web Server (e.g. IIS) is already compressing static files, in which case you won't want to configure ServiceStack to do it. No compression is added when running ServiceStack in a self-host, which will benefit from enabling Static File Compression. ## Brotli Compression **.NET 6+** Apps have access to .NET Core's `BrotliStream` which is fully supported throughout ServiceStack, e.g. in Cached & Compressed Responses as well as sending compressed Request payloads in Service Clients. The Brotli implementation is encapsulated within ServiceStack's compression abstractions whose implementations are contained within: - **BrotliCompressor** - Brotli (br) - **DeflateCompressor** - Deflate (deflate) - **GZipCompressor** - GZIP (gzip) Which all implement the same substitutable interface: ```csharp public interface IStreamCompressor { string Encoding { get; } byte[] Compress(string text, Encoding? encoding = null); byte[] Compress(byte[] bytes); Stream Compress(Stream outputStream, bool leaveOpen=false); string Decompress(byte[] zipBuffer, Encoding? encoding = null); Stream Decompress(Stream zipBuffer, bool leaveOpen=false); byte[] DecompressBytes(byte[] zipBuffer); } ``` That are managed with `StreamCompressors` in the **ServiceStack.Client** package: ```csharp public static class StreamCompressors { // Is there a compressor registered with this encoding? public static bool SupportsEncoding(string? encoding); // return the registered IStreamCompressor implementation for this public static IStreamCompressor? Get(string? 
encoding); // Assert there exists an IStreamCompressor for this encoding public static IStreamCompressor GetRequired(string encoding); // Register a new compressor for a specific encoding (defaults: gzip, deflate, br*) .NET6+ public static void Set(string encoding, IStreamCompressor compressor); // Remove compression support for this encoding public static bool Remove(string encoding); } ``` Containing pre-registered implementations of all popular Brotli, Deflate & gzip HTTP Compression algorithms so there's typically no need to add any yourself. The preferred compression implementation for a request can be retrieved with `IRequest.GetCompressor()` which determines the implementation to use based on the overridable `GetCompressionType(IRequest request)` method in your AppHost. ### Brotli disabled for Firefox Brotli is currently not returned for Firefox browsers (by **UserAgent** detection in `AppHost.GetCompressionType()`) which, for a yet to be determined reason, is the only modern browser that doesn't support .NET's `BrotliStream` output. We'll continue to investigate and remove the restriction when resolved. --- # Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/concurrency-model.md --- title: Concurrency Model --- ServiceStack doesn't have a configurable concurrency model per AppHost; it is dependent upon the AppHost that your ServiceStack services are hosted with: ## ASP.NET Host (AppHostBase) For ASP.NET web hosts, ServiceStack **doesn't create any new threads** itself; requests are simply handled on the same IIS/Nginx/etc ASP.NET HTTP WebWorker that handles the request. ## HttpListener Self-Host (AppSelfHostBase) The default Self-Host HttpListener option for ServiceStack executes requests on the [SmartThreadPool](http://www.codeproject.com/Articles/7933/Smart-Thread-Pool) managed ThreadPool. By default it executes on `Environment.ProcessorCount * 2` up to a maximum of 16 worker threads. 
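To make the hosting model concrete, a minimal self-host sketch (the `MyServices` class name is a placeholder for your own Service class):

```csharp
// A minimal HttpListener self-host sketch; MyServices is a placeholder
// Service class containing your ServiceStack Services
public class AppHost : AppSelfHostBase
{
    public AppHost() : base("My Self-Host", typeof(MyServices).Assembly) {}

    public override void Configure(Funq.Container container) {}
}

// Started once on Startup; requests are then executed
// on SmartThreadPool-managed worker threads
new AppHost().Init().Start("http://localhost:8088/");
```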
See this chart for the [performance of the different ServiceStack Hosts](https://github.com/ServiceStack/ServiceStack/blob/master/release-notes.md#new-much-faster-self-host). ## HttpListener Pool Self-Host (AppHostHttpListenerPoolBase) This is another Self-Host HttpListener option for ServiceStack that uses its own managed ThreadPool to execute requests on (freeing up the HttpListener async callback thread). The default poolSize of the ThreadPool is **500** threads, though this is configurable in the `AppHostHttpListenerPoolBase(serviceName, handlerPath, poolSize, assembliesWithServices)` constructor. ## HttpListener Single Self-Host (AppHostHttpListenerBase) ServiceStack only creates a new thread on **Startup** when you call `new AppHost().Start(url)`. There are no new threads created at run-time, i.e. the request is handled on the HttpListener async callback thread. ## RedisMQ Host (RedisMqServer) A good option for managing long-running tasks is to delegate requests to a [Redis MQ Host](/redis-mq) which is a light-weight MQ Server allowing you to defer and process requests in managed background threads. By default the RedisMqServer spawns a single background thread for each Message type (i.e. 
Request), though this is configurable on start-up, e.g: in the example below **2 background threads** are used to handle `PostTwitter` requests, whilst only 1 background thread each is used to process `CallFacebook` and `EmailMessage` requests: ```csharp mqServer.RegisterHandler<PostTwitter>(ServiceController.ExecuteMessage, noOfThreads:2); mqServer.RegisterHandler<CallFacebook>(ServiceController.ExecuteMessage); mqServer.RegisterHandler<EmailMessage>(ServiceController.ExecuteMessage); ``` --- # Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/config-api.md --- slug: config-api title: Config API --- Despite being avid protesters in the anti-XML config movement, we're still 100% for app Config in general though it should be **limited to what's actually configurable by your application**. Instead of building tiered configSection manatees we prefer to store structured data in Web.config's appSetting values which are still able to express rich object config graphs but do so in a much more human-friendly and manageable size. ## ServiceStack's Configuration API To this end we provide our own pluggable [Configuration API](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Configuration/IResourceManager.cs) to provide high-level utility methods to read your Web.config's `<appSetting/>` values into a `List`, `Dictionary` or your own Custom POCO Type using the human friendly [JSV format](/jsv-format). 
### Benefits over XML Config Benefits over existing XML Configuration APIs include: - The ability to store rich data structures in **appSettings** values - Much easier and requires less effort and boilerplate to create - Provides more succinct access to typed data - Since they're just POCOs, they can be re-used in all of ServiceStack's libraries and built-in [Auto Mapping](/auto-mapping), and promote less coupling since it's only an [interface](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Configuration/IResourceManager.cs) so it can easily be swapped to have [Plugins](/plugins) source their complex configuration from a different source (e.g. from a central DB) without a rewrite. [OpenId](/auth/openid) providers like the [FacebookAuthProvider](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.ServiceInterface/Auth/FacebookAuthProvider.cs#L23) are examples of Plugins that require multiple configuration settings but remain de-coupled from any one configuration source (e.g. Web.config). ### Example AppSettings Usage By default ServiceStack ships with an [AppSettings](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack/Configuration/AppSettings.cs) provider which reads from your Web.Config `<appSettings/>` and a [DictionarySettings](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack/Configuration/DictionarySettings.cs) provider which can be populated with a standard C# `Dictionary<string,string>`. 
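For instance, a sketch of using the `DictionarySettings` provider as an in-memory configuration source, e.g. in unit tests (the keys and values below are illustrative):

```csharp
// An in-memory settings provider, handy for unit tests;
// keys/values are illustrative
var appSettings = new DictionarySettings(new Dictionary<string, string> {
    ["AllowedUsers"] = "Tom,Mick,Harry",
});

// Same typed Configuration API, different backing source
IList<string> allowedUsers = appSettings.GetList("AllowedUsers");
var searchUrl = appSettings.Get("SearchUrl", "http://www.google.com"); //fallback default
```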
Here's a quick example show-casing how to use the popular **AppSettings**: ```xml <appSettings> <add key="LastUpdated" value="01/01/2012 12:00:00" /> <add key="AllowedUsers" value="Tom,Mick,Harry" /> <add key="RedisConfig" value="{Host:localhost,Port:6379,Database:1,Timeout:10000}" /> </appSettings> ``` Accessing the above appSettings in C#: ```csharp var appSettings = new AppSettings(); DateTime lastUpdate = appSettings.Get<DateTime>("LastUpdated"); IList<string> allowedUsers = appSettings.GetList("AllowedUsers"); var redisConf = appSettings.Get<RedisConfig>("RedisConfig"); //use default value if no config exists var searchUrl = appSettings.Get("SearchUrl", "http://www.google.com"); ``` ### Default configuration in code The default value support is nice as it allows having workable default options in code whilst still remaining overridable in the **Web.config** when needed. This allows local and test projects to work without duplicating and maintaining their own Web.config files whilst allowing arbitrary settings to be overridable in different deployment environments. It also allows distributing Stand-alone Console applications like the [PocoPower demo](https://github.com/ServiceStack/ServiceStack.UseCases/blob/master/PocoPower/Program.cs) whilst still providing the opportunity to override the settings without recompiling the source, e.g: ```csharp var appSettings = new AppSettings(); var config = appSettings.Get("my.config", new Config { GitHubName = "mythz", TwitterName = "ServiceStack" }); var github = new GithubGateway(); var repos = github.GetAllUserAndOrgsReposFor(config.GitHubName); var twitter = new TwitterGateway(); var tweets = twitter.GetTimeline(config.TwitterName); ``` ## Easy to implement Despite being so versatile, it's surprisingly easy to implement a new Configuration Provider, e.g. 
Here's the entire implementation for [DictionarySettings](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack/Configuration/DictionarySettings.cs) which just needs to implement [ISettings](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack/Configuration/ISettings.cs) and is able to re-use the built-in `AppSettingsBase` base class: ```csharp public class DictionarySettings : AppSettingsBase, ISettings { private readonly Dictionary<string, string> map; public DictionarySettings(Dictionary<string, string> map=null) { this.map = map ?? new Dictionary<string, string>(); settings = this; } public string Get(string key) { string value; return map.TryGetValue(key, out value) ? value : null; } } ``` --- # Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/ai-server/configuration.md --- title: Configuring AI Server --- AI Server can be configured in several ways: - **install.sh Script**: Run the `install.sh` script to set up the AI Server and ComfyUI Agent. - **.env File**: Update the `.env` file with your API keys and run the AI Server for the first time. - **Admin Portal**: Use the Admin Portal to add, edit, or remove AI Providers and generate AI Server API keys. ## Running the Installer The `install.sh` script is the quickest way to get AI Server up and running with the default configuration. This is ideal for local development and testing. To run the installer: ```sh git clone https://github.com/ServiceStack/ai-server.git cd ai-server cat install.sh | bash ``` The installer will prompt you to configure your AI Providers and optionally add the ComfyUI Agent. ## `.env` Configuration The installer populates the `.env` file with the choices you made during installation. You can also manually configure the `.env` file with your API keys and settings. 
```ini OPENAI_API_KEY=<Your OpenAI API Key> ANTHROPIC_API_KEY=<Your Anthropic API Key> OPENROUTER_API_KEY=<Your OpenRouter API Key> MISTRAL_API_KEY=<Your Mistral API Key> GOOGLE_API_KEY=<Your GoogleCloud API Key> GROQ_API_KEY=<Your Groq Cloud API Key> AUTH_SECRET="p@55wOrd" ASSETS_BASE_URL="http://localhost:5006" ``` After these values are set in your `.env` file, you can run the AI Server for the first time via docker compose: ```sh docker compose up ``` This will perform an initial setup, saving provider configuration in the SQLite database. From here, you can manage your AI Providers via the [Admin Portal](http://localhost:5006/admin). :::info The default credentials to access the Admin Portal are `p@55wOrd`; this can be changed in your `.env` file by setting the `AUTH_SECRET` key. ::: ### Using the Admin Portal The Admin Portal provides a more interactive way to manage your AI Providers after the AI Server is running. To access the Admin Portal: 1. Navigate to [http://localhost:5006/admin](http://localhost:5006/admin). 2. Log in with the default credentials `p@55wOrd`. 3. Click on the **AI Providers** tab to view and manage your AI Providers. Here you can add, edit, or remove AI Providers, as well as generate API keys for each provider. AI Server supports the following AI Providers: - **OpenAI**: OpenAI Chat API - **Anthropic**: Anthropic Claude API - **Google**: Google Cloud AI - **OpenRouter**: OpenRouter API - **Mistral**: Mistral API - **GROQ**: GROQ API - **Ollama**: Ollama API Media Providers can also be configured in the Admin Portal. 
These include: - **ComfyUI**: ComfyUI Agent - **Image Generation** - **Text-to-Speech** - **Speech-to-Text** - **Video & Image Processing** - **Replicate**: Replicate API - **Image Generation** - **OpenAI**: OpenAI API - **Image Generation** - **Text-to-Speech** ## Register AI Providers Let AI Server know about your API Providers or self-hosted instances by creating them in the **AI Provider** section, where you can use the **Custom** AI Provider type to register any OpenAI Chat compatible endpoint, e.g: [![](/img/pages/ai-server/custom-openai-provider.webp)](/ai-server/) After registering AI Providers you can try them out in the [Chat UI](/ai-server/chat): [![](/img/pages/ai-server/custom-openai-provider-chat.webp)](/ai-server/chat) ## Create API Keys for your Apps After testing the llama-server instance is working with the Chat UI it's time to create API Keys for all your Apps so they can access AI Server's APIs with the [API Keys UI](/auth/admin-apikeys#api-keys-admin-ui): ![](/img/pages/ai-server/admin-apikeys.webp) Here you can create new API keys, view existing keys, and revoke keys as needed. Keys can be created with expiration dates, and restrictions to specific API endpoints, along with notes to help identify the key's purpose. It's recommended to use a different API Key per App so they can be monitored and analyzed separately. With a valid API Key in hand your Apps can use AI Server's DTOs with ServiceStack generic service clients to enable typed integrations in [11 different languages](/ai-server/openai-chat-all-languages). ## Stored File Management AI Server stores the results of AI operations in pre-configured paths: - **Artifacts**: AI generated images, audio, and video files, default path is `App_Data/artifacts`. - **Files**: Cached variants and processed files, default path is `App_Data/files`. These paths can be configured in the `.env` file by setting the `ARTIFACTS_PATH` and `AI_FILES_PATH` keys. 
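For example, the storage paths could be relocated in your `.env` file (the paths below are illustrative, not defaults):

```ini
# illustrative paths; defaults are App_Data/artifacts and App_Data/files
ARTIFACTS_PATH=/mnt/ai-server/artifacts
AI_FILES_PATH=/mnt/ai-server/files
```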
## Custom Definitions AI Server's knowledge is limited to the AI Provider and Model types and definitions defined in its [/data](https://github.com/ServiceStack/ai-server/tree/main/AiServer/wwwroot/lib/data) definitions. These definitions are merged and can be extended with custom definitions you can create in your `/App_Data/overrides/` folder, e.g: ```files /App_Data /overrides ai-models.json ai-types.json generation-types.json media-models.json media-types.json prompts.json tts-voices.json ``` --- # Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/razor-press/containers.md --- title: Custom Markdown Containers --- [Custom Containers](https://github.com/xoofx/markdig/blob/master/src/Markdig.Tests/Specs/CustomContainerSpecs.md) are a popular method for implementing Markdown Extensions, enabling rich, wrist-friendly, consistent content in your Markdown documents. ## Built-in Containers Most of [VitePress Containers](https://vitepress.dev/guide/markdown#custom-containers) are also implemented in Razor Press, e.g: #### Input ```markdown ::: info This is an info box. ::: ::: tip This is a tip. ::: ::: warning This is a warning. ::: ::: danger This is a dangerous warning. ::: ``` #### Output ::: info This is an info box. ::: ::: tip This is a tip. ::: ::: warning This is a warning. ::: ::: danger This is a dangerous warning. ::: ### Custom Title You can specify a custom title by appending the text right after the container type: #### Input ```markdown ::: danger STOP Danger zone, do not proceed ::: ``` #### Output ::: danger STOP Danger zone, do not proceed ::: ### Pre The **pre** container can be used to capture its content in a `<pre>` element instead of its default markdown rendering: ```markdown :::pre ... ::: ``` ### copy The **copy** container is ideal for displaying text snippets in a component that allows for easy copying: #### Input ```markdown :::copy Copy Me! ::: ``` #### Output :::copy Copy Me! 
::: HTML or XML fragments can also be copied by escaping them first: #### Input ```markdown :::copy `<PackageReference Include="ServiceStack" Version="10.*" />` ::: ``` #### Output :::copy `<PackageReference Include="ServiceStack" Version="10.*" />` ::: ### sh Similarly the **sh** container is ideal for displaying and copying shell commands: #### Input ```markdown :::sh npm run prerender ::: ``` #### Output :::sh npm run prerender ::: ## Implementing Block Containers [Markdig Containers](https://github.com/xoofx/markdig/blob/master/src/Markdig.Tests/Specs/CustomContainerSpecs.md) are a great way to create rich widgets that can be used directly in Markdown. They're useful for ensuring similar content is displayed consistently across all your documentation. A good use-case for this could be to implement a YouTube component for standardizing how YouTube videos are displayed. For this example we want to display a YouTube video using just its YouTube **id** and a **title** for the video which we can capture in the Custom Container: ```markdown :::YouTube MRQMBrXi5Sc Using Razor SSG to Create Websites in GitHub Codespaces ::: ``` Which we can implement with a normal Markdig `HtmlObjectRenderer<CustomContainer>`: ```csharp public class YouTubeContainer : HtmlObjectRenderer<CustomContainer> { protected override void Write(HtmlRenderer renderer, CustomContainer obj) { if (obj.Arguments == null) { renderer.WriteLine($"Missing YouTube Id, Usage :::{obj.Info} <id>"); return; } renderer.EnsureLine(); var youtubeId = obj.Arguments!; var attrs = obj.TryGetAttributes()!; attrs.Classes ??= new(); attrs.Classes.Add("not-prose text-center"); renderer.Write("<div").WriteAttributes(obj).Write('>'); renderer.WriteLine("<div class=\"text-3xl font-extrabold tracking-tight\">"); renderer.WriteChildren(obj); renderer.WriteLine("</div>"); renderer.WriteLine(@$"<div class=""mt-3 flex justify-center""> <lite-youtube class=""w-full mx-4 my-4"" width=""560"" height=""315"" 
videoid=""{youtubeId}"" style=""background-image:url('https://img.youtube.com/vi/{youtubeId}/maxresdefault.jpg')""> </lite-youtube> </div> </div>"); } } ``` That should be registered in `Configure.Ssg.cs` with the name we want to use for the container: ```csharp MarkdigConfig.Set(new MarkdigConfig { ConfigureContainers = config => { // Add Custom Block or Inline containers config.AddBlockContainer("YouTube", new YouTubeContainer()); } }); ``` After which it can be used in your Markdown documentation: #### Input ```markdown :::YouTube MRQMBrXi5Sc Using Razor SSG to Create Websites in GitHub Codespaces ::: ``` #### Output :::YouTube MRQMBrXi5Sc Using Razor SSG to Create Websites in GitHub Codespaces ::: ### Custom Attributes Since we use `WriteAttributes(obj)` to emit any attributes we're also able to customize the widget to use a custom **id** and classes, e.g: #### Input ```markdown :::YouTube MRQMBrXi5Sc {.text-indigo-600} Using Razor SSG to Create Websites in GitHub Codespaces ::: ``` #### Output :::YouTube MRQMBrXi5Sc {.text-indigo-600} Using Razor SSG to Create Websites in GitHub Codespaces ::: ## Implementing Inline Containers Custom Inline Containers are useful when you don't need a to capture a block of content, like if we just want to display a video without a title, e.g: ```markdown ::YouTube MRQMBrXi5Sc:: ``` Inline Containers can be implemented with a Markdig `HtmlObjectRenderer<CustomContainerInline>`, e.g: ```csharp public class YouTubeInlineContainer : HtmlObjectRenderer<CustomContainerInline> { protected override void Write(HtmlRenderer renderer, CustomContainerInline obj) { var youtubeId = obj.FirstChild is Markdig.Syntax.Inlines.LiteralInline literalInline ? 
literalInline.Content.AsSpan().RightPart(' ').ToString() : null; if (string.IsNullOrEmpty(youtubeId)) { renderer.WriteLine($"Missing YouTube Id, Usage ::YouTube <id>::"); return; } renderer.WriteLine(@$"<div class=""mt-3 flex justify-center""> <lite-youtube class=""w-full mx-4 my-4"" width=""560"" height=""315"" videoid=""{youtubeId}"" style=""background-image:url('https://img.youtube.com/vi/{youtubeId}/maxresdefault.jpg')""> </lite-youtube> </div>"); } } ``` That can be registered in `Configure.Ssg.cs` with: ```csharp MarkdigConfig.Set(new MarkdigConfig { ConfigureContainers = config => { // Add Custom Block or Inline containers config.AddInlineContainer("YouTube", new YouTubeInlineContainer()); } }); ``` Where it can then be used in your Markdown documentation: #### Input ```markdown ::YouTube MRQMBrXi5Sc:: ``` #### Output ::YouTube MRQMBrXi5Sc:: --- # Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/templates/corefx.md --- title: Run ASP.NET Core Apps on the .NET Framework slug: templates-corefx --- ![](https://raw.githubusercontent.com/ServiceStack/Assets/master/csharp-templates/web.png) ### Status of ASP.NET Core on .NET Framework The last important Microsoft announcement concerning the future of ASP.NET Core 2.1 was that it would stop supporting [new versions of ASP.NET Core on the .NET Framework](https://github.com/aspnet/AspNetCore/issues/3753). 
Whilst we disagreed with this decision, which would've shut out a large class of the existing ecosystem from participating in the new ASP.NET Core development model and prevented many staged migrations to .NET Core from commencing, we're happy to see [ASP.NET Core 2.1 LTS will enjoy the same indefinite level of support](https://github.com/aspnet/AspNetCore/issues/3753#issuecomment-438046364) as the rest of the .NET Framework - which should come as great news to the 1/3 of our Customers who are still creating new [ASP.NET Core on FX Project Templates](https://github.com/NetFrameworkCoreTemplates). Whilst this announcement sends a clear message that new development on .NET Framework has effectively been put on product life support, **ASP.NET Core 2.1 LTS** is still a great rock-solid platform to build on if you're unable to jump directly to **.NET Core** immediately or if you want to get off .NET Core's major version release train and build upon a stable LTS platform. ### ASP.NET Core - still our top recommendation for .NET Framework If you need to stay on the .NET Framework, we'd still recommend using the newer **ASP.NET Core 2.1** over classic **ASP.NET System.Web** projects as it's cleaner, lighter, more flexible and future proof. Unlike Microsoft web frameworks, ServiceStack is a **single code-base** which supports running on [multiple platforms](/why-servicestack#multiple-hosting-options) so your ServiceStack Services can enjoy near perfect source-code compatibility when and if you choose to **move to .NET Core** in future. Whilst Microsoft is stopping new development of ASP.NET Core on .NET Framework, we're not: our supported packages have standardized to multi-target both **.NET v4.5+** and **.NET Standard 2.0** which is supported natively on **ASP.NET Core 2.1**. 
### Future proofed and continually developed This includes our own innovations that we continue to invest in, like [#Script](https://sharpscript.net) (fka ServiceStack Templates), which naturally supports .NET Framework and .NET Core and runs everywhere ServiceStack does, including within [classic ASP.NET MVC Controllers](https://sharpscript.net/docs/mvc-netcore) - which wasn't a design goal but was a natural consequence of developing clean libraries without external dependencies or reliance on external tooling. This is to say that **ASP.NET Core 2.1 LTS** is still a fantastic rock-solid platform to run your .NET Framework workloads on when you need to, which will continue to receive enhancements and new features with each ServiceStack release courtesy of being derived from the same shared code-base, enabling seamless migrations to .NET Core should you wish to in future. #### Start from pre-configured Project Templates Ultimately you'll miss out on niceties like the [Microsoft.AspNetCore.App](https://www.nuget.org/packages/Microsoft.AspNetCore.App) meta-package, as a result we recommend starting from one of our [ASP.NET Core Framework project Templates](https://github.com/NetFrameworkCoreTemplates) which by convention all have the `-corefx` suffix: <WebNewCorefxMd></WebNewCorefxMd> #### Usage This will let you create an ASP.NET Core App running on the .NET Framework v4.7 using [web new](/web-new) with: ```bash $ dotnet tool install --global x $ x new web-corefx AcmeNetFx ``` Which can then be opened in your preferred VS.NET or Project Rider C# IDE. ### Reference .Core packages The primary difference between ASP.NET Core Apps on **.NET Core 2.1** vs **.NET Framework** is needing to reference the `.Core` packages to force referencing ServiceStack **.NET Standard 2.0** libraries, which otherwise when installed in a .NET Framework project would install `net45` libraries. 
The differences between the 2 builds include: - `net45` - Contains support for running **ASP.NET** Web or Self-Hosting **HttpListener** App Hosts - `netstandard2.0` - Contains support for only running on **ASP.NET Core** App Hosts In order to run ASP.NET Core Apps on the .NET Framework it needs to only reference `.Core` NuGet packages which contains only the **.NET Standard 2.0** builds. Currently the list of `.Core` packages which contains only **.NET Standard 2.0** builds include: - ServiceStack.Text.Core - ServiceStack.Interfaces.Core - ServiceStack.Client.Core - ServiceStack.HttpClient.Core - ServiceStack.Core - ServiceStack.Common.Core - ServiceStack.Mvc.Core - ServiceStack.Server.Core - ServiceStack.Redis.Core - ServiceStack.OrmLite.Core - ServiceStack.OrmLite.Sqlite.Core - ServiceStack.OrmLite.SqlServer.Core - ServiceStack.OrmLite.PostgreSQL.Core - ServiceStack.OrmLite.MySql.Core - ServiceStack.OrmLite.MySqlConnector.Core - ServiceStack.Aws.Core - ServiceStack.Azure.Core - ServiceStack.RabbitMq.Core - ServiceStack.Api.OpenApi.Core - ServiceStack.Admin.Core - ServiceStack.Stripe.Core - ServiceStack.Logging.Log4Net.Core - ServiceStack.Logging.NLog.Core - ServiceStack.Kestrel.Core ::: warning Ultimately support for whether a **.NET Standard 2.0** library will run on the .NET Framework depends on whether external dependencies also support this scenario which as it's a more niche use-case, will be a less tested scenario ::: ## Troubleshooting ### Resolving Runtime Assembly Loading Issues Many of Microsoft's core **.NET Standard 2.0** packages have been reported to have runtime Assembly loading issues that throw `FileNotFoundException` "Could not load file or assembly ..." 
Exceptions in .NET Framework projects, including:

- `System.Runtime`
- `System.Runtime.CompilerServices.Unsafe`
- `System.Runtime.InteropServices.RuntimeInformation`
- `System.Memory`
- `System.Buffers`
- `System.Numerics.Vectors`
- `netstandard`

Some solutions that have been known to resolve these issues include:

1. Adding the package reference, e.g. `System.Runtime.CompilerServices.Unsafe`, directly on the Host project; [for netstandard](https://github.com/dotnet/standard/issues/328#issuecomment-299577190) the package is [NETStandard.Library.NETFramework](https://www.nuget.org/packages/NETStandard.Library.NETFramework), or installing the .NET Core 2.0 SDK
2. Manually adding a Binding Redirect, see:
    - [System.Runtime](https://stackoverflow.com/a/52250140/85785)
    - [System.Runtime.CompilerServices.Unsafe](https://stackoverflow.com/a/55329952/85785)
    - [System.Runtime.InteropServices.RuntimeInformation](https://stackoverflow.com/a/52637120/85785)
    - [System.Numerics.Vectors](https://github.com/dotnet/corefx/issues/30106#issuecomment-395248278)
    - [System.Net.Http](https://stackoverflow.com/a/48867478/85785)
3. [Install the missing .dll into the GAC](https://stackoverflow.com/a/62770487/85785)
4. If you had an existing binding redirect, try removing it
5. Adding `<AutoGenerateBindingRedirects>true</AutoGenerateBindingRedirects>` to your project's .csproj
6. Uninstalling and reinstalling the problem packages in your projects
7. Cleaning the solution and removing project artifacts, including the NuGet `/packages` and project `/bin` and `/obj` folders
8. Upgrading to the latest version of the .NET Framework (v4.7.2+)

Many of these issues are the result of [older .NET Frameworks like .NET v4.6.1](https://github.com/dotnet/standard/issues/481) not properly supporting .NET Standard 2.0, which is mostly resolved by installing .NET Framework v4.7.1+.
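For reference, a manual binding redirect (option 2 above) is added to your Host project's `Web.config`/`App.config`. The sketch below uses `System.Runtime.CompilerServices.Unsafe` as the example; the version numbers are illustrative and need to match the assembly version your project actually resolves (the exception message and the linked answers above show how to find it):

```xml
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <!-- Redirect all older assembly versions to the one deployed in /bin -->
        <assemblyIdentity name="System.Runtime.CompilerServices.Unsafe"
                          publicKeyToken="b03f5f7f11d50a3a" culture="neutral" />
        <bindingRedirect oldVersion="0.0.0.0-4.0.6.0" newVersion="4.0.6.0" />
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>
```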
---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/corsfeature.md

---
slug: corsfeature
title: CORS Feature
---

### Enable CORS in ASP.NET Core Apps

Optionally .NET Apps can utilize the built-in [ASP.NET CORS Support](https://learn.microsoft.com/en-us/aspnet/core/security/cors) with the mix in:

:::sh
npx add-in cors
:::

This will add the `Configure.Cors.cs` [Modular Startup](/modular-startup) to your Host project which can be further customized to support your use-case:

```csharp
[assembly: HostingStartup(typeof(MyApp.ConfigureCors))]

namespace MyApp;

public class ConfigureCors : IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureServices(services => {
            services.AddCors(options => {
                options.AddDefaultPolicy(policy => {
                    policy.WithOrigins([
                        "http://localhost:5000", "https://localhost:5001",
                        "http://localhost:8080",
                        "https://localhost:5173", "http://localhost:5173",
                    ])
                    .AllowCredentials()
                    .WithHeaders(["Content-Type", "Allow", "Authorization"])
                    .SetPreflightMaxAge(TimeSpan.FromHours(1));
                });
            });
            services.AddTransient<IStartupFilter, StartupFilter>();
        });

    public class StartupFilter : IStartupFilter
    {
        public Action<IApplicationBuilder> Configure(Action<IApplicationBuilder> next) => app => {
            app.UseCors();
            next(app);
        };
    }
}
```

### CORS Feature Plugin

The **CorsFeature** plugin, supported by all ServiceStack Hosts, wraps CORS headers into an encapsulated [Plugin][1] to make it much easier to add CORS support to your ServiceStack services.

Commonly this is now all that's needed:

```csharp
Plugins.Add(new CorsFeature());
```

Which uses the default values:

```csharp
CorsFeature(allowedOrigins: "*",
    allowedMethods: "GET, POST, PUT, DELETE, OPTIONS",
    allowedHeaders: "Content-Type",
    allowCredentials: false);
```

You can leave out any of the values matching the default. E.g.
if you just wanted to restrict the allowed methods to just GET and POST requests, you can just do:

```csharp
Plugins.Add(new CorsFeature(allowedMethods: "GET, POST"));
```

### Allow specific origins

Use `allowOriginWhitelist` when you want to only allow CORS access by specific sites:

```csharp
Plugins.Add(new CorsFeature(
    allowOriginWhitelist: new[] { "http://localhost", "http://localhost:5000", "http://run.plnkr.co" },
    allowCredentials: true,
    allowedHeaders: "Content-Type, Allow, Authorization, X-Args"));
```

### Enabling CORS per-service support

Instead of using the plugin above, ServiceStack also allows you to enable CORS on a per-service basis by using the **[EnableCors]** [Response Filter attribute][2] which has the same defaults as above. E.g. you can enable just GET, POST as above with:

```csharp
[EnableCors(allowedMethods: "GET,POST")]
public class MyService : Service { ... }
```

## Manually enabling CORS

The beauty of [ServiceStack][3] is that it's built on a highly flexible and simple core. We don't believe in building strong-typed APIs over everything, as it's impossible to predict what new HTTP Headers / StatusCodes will exist in the future. So whilst we provide convenient behavior to accomplish common tasks, we also provide a flexible API that lets you configure any desired HTTP Output.

### Setting Global HTTP Headers

This is how to globally enable Cross-Origin Resource Sharing in your AppHost config:

```csharp
public override void Configure(Container container)
{
    //Permit modern browsers (e.g. Firefox) to allow sending of any HTTP Method
    SetConfig(new HostConfig {
        GlobalResponseHeaders = {
            { "Access-Control-Allow-Origin", "*" },
            { "Access-Control-Allow-Methods", "GET, POST, PUT, DELETE, OPTIONS" },
            { "Access-Control-Allow-Headers", "Content-Type" },
        },
    });
}
```

### Returning Custom HTTP Headers in a service

These headers will get sent on every request, alternatively you can also enable it for specific web services, i.e.
take the [Hello World Web Service][4] for example:

```csharp
public class Hello
{
    public string Name { get; set; }
}

public class HelloResponse
{
    public string Result { get; set; }
}

public class HelloService : IService
{
    public object Any(Hello request)
    {
        var dto = new HelloResponse { Result = "Hello, " + request.Name };
        return new HttpResult(dto) {
            Headers = {
                { "Access-Control-Allow-Origin", "*" },
                { "Access-Control-Allow-Methods", "GET, POST, PUT, DELETE" },
                { "Access-Control-Allow-Headers", "Content-Type" },
            }
        };
    }
}
```

The above is all the C# code you need to develop a web service which is then automatically wired up for you on all HTTP Verbs (GET, POST, etc) and built-in endpoints, i.e. JSON, XML, JSV, HTML, CSV, SOAP 1.1/1.2 - for free, without any config or friction required. Checkout [the live example of the above web service][5].

## JSONP

In addition to the above endpoints, each service can also be called via [JSONP](https://en.wikipedia.org/wiki/JSONP) (another popular way to enable cross-domain service calls in Ajax apps) by simply adding the **?callback=cb** parameter to the querystring, e.g: [techstacks.io/technology/servicestack?callback=cb](https://techstacks.io/technology/servicestack?callback=cb):

```js
//Response:
cb({
    ...
})
```

[1]: /plugins
[2]: /filter-attributes
[3]: http://www.servicestack.net
[4]: http://www.servicestack.net/ServiceStack.Hello/
[5]: http://www.servicestack.net/ServiceStack.Hello/
[6]: http://stackoverflow.com/questions/6245616/does-servicestack-support-binary-responses
[7]: http://www.servicestack.net/benchmarks/

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/create-webservice-from-scratch.md

---
slug: create-webservice-from-scratch
title: Creating a WebService from scratch
---

## Step 1: Create an application

ServiceStack can be hosted in a few ways: console application, Windows Service, ASP.NET Web Forms or MVC in IIS, etc. For this tutorial, an empty ASP.NET Web Application (non MVC) is assumed.

## Step 2: Install ServiceStack

To install ServiceStack into your application, add the following NuGet package reference:

:::copy
`<PackageReference Include="ServiceStack" Version="5.12.0" />`
:::

::: info Tip
You can find an explanation about all NuGet packages which ServiceStack offers [here](/nuget).
The package above only adds the binaries, but there also exist some packages which add the required configurations etc
:::

### Register ServiceStack Handler

After you've added the binaries, you need to register ServiceStack in `web.config`. If you want to host ServiceStack at the root path (`/`), you should use this configuration:

```xml
<!-- For IIS 6.0/Mono -->
<system.web>
  <httpHandlers>
    <add path="*" type="ServiceStack.HttpHandlerFactory, ServiceStack" verb="*"/>
  </httpHandlers>
</system.web>

<!-- For IIS 7.0+ -->
<system.webServer>
  <validation validateIntegratedModeConfiguration="false" />
  <handlers>
    <add path="*" name="ServiceStack.Factory" preCondition="integratedMode"
         type="ServiceStack.HttpHandlerFactory, ServiceStack" verb="*"
         resourceType="Unspecified" allowPathInfo="true" />
  </handlers>
</system.webServer>
```

::: info Tip
If you want to host your webservice on a custom path to avoid conflicts with another web framework (e.g. ASP.NET MVC), see [Run ServiceStack side-by-side with another web framework](/servicestack-side-by-side-with-another-web-framework)
:::

::: warning
Due to limitations in IIS 6 - host [ServiceStack at a /custompath](/mvc-integration#enabling-servicestack-in-webconfig) which must end with `.ashx`, e.g: `path="api.ashx"`
:::

## Step 3: Create your first webservice

If `Global.asax.cs` doesn't already exist you have to add it manually. To do this **Right-click** on your project and go **Add -> New Item**, then select the **Global Application Class**.

Each service in ServiceStack consists of three parts:

- Request DTO
- Service implementation
- Response DTO

That's the core philosophy in ServiceStack. Each service has a strongly-typed, code-first (normal POCOs) Request DTO and Response DTO. You can read a detailed explanation of the advantages of using DTOs in the [ReadMe](https://github.com/ServiceStack/ServiceStack/blob/master/README.md) or in [Why should I use ServiceStack?](/why-servicestack).
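A practical payoff of this DTO-first approach is that the same POCOs defined in the steps below can be shared with any .NET client to get an end-to-end typed API using ServiceStack's `JsonServiceClient` (from the **ServiceStack.Client** package). This is a hedged sketch of that pattern: the `IReturn<T>` marker interface is added so the client call is fully typed, and the base URL is illustrative:

```csharp
//Shared DTOs: adding IReturn<HelloResponse> gives clients a typed response
[Route("/hello/{Name}")]
public class Hello : IReturn<HelloResponse>
{
    public string Name { get; set; }
}

public class HelloResponse
{
    public string Result { get; set; }
}

//In any .NET client app referencing the shared DTOs (URL illustrative):
var client = new JsonServiceClient("http://localhost:5000/");
HelloResponse response = client.Get(new Hello { Name = "World" });
//Given the service implementation below, response.Result is "Hello, World"
```

Nothing about the service contract changes; the client simply reuses the same types that define the service.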
1) Create the name of your Web Service (i.e. the Request DTO)

```csharp
[Route("/hello")]
[Route("/hello/{Name}")]
public class Hello
{
    public string Name { get; set; }
}
```

2) Define what your Web Service will return (i.e. the Response DTO)

```csharp
public class HelloResponse
{
    public string Result { get; set; }
}
```

3) Create your Web Service implementation

```csharp
public class HelloService : Service
{
    public object Any(Hello request)
    {
        return new HelloResponse { Result = "Hello, " + request.Name };
    }
}
```

## Step 4: Registering your web services and starting your application

The final step is to configure your AppHost to tell ServiceStack where to find your web services. To do that, add this code to your `Global.asax.cs`:

```csharp
public class Global : System.Web.HttpApplication
{
    public class AppHost : AppHostBase
    {
        //Tell ServiceStack the name of your application and where to find your services
        public AppHost() : base("Hello Web Services", typeof(HelloService).Assembly) { }

        public override void Configure(Funq.Container container)
        {
            //register any dependencies your services use, e.g:
            //container.Register<ICacheClient>(new MemoryCacheClient());
        }
    }

    //Initialize your application singleton
    protected void Application_Start(object sender, EventArgs e)
    {
        new AppHost().Init();
    }
}
```

Done! You now have a working application :)

As you can see, you have created an `AppHost`. Nearly all ServiceStack configuration is made in the `AppHost`. It's the starting point of your application.

#### Disable WebApi from the default MVC4 VS.NET template

If you are using MVC4 then you need to comment out the following line in `Global.asax.cs` to disable Web API:

```cs
//WebApiConfig.Register(GlobalConfiguration.Configuration);
```

## ServiceStack is now Ready!
Now that you have a working Web Service let's see what ServiceStack does for you out of the box:

If everything is configured correctly you can go to `http://<root_path>/metadata` to see a list of your web services and the various endpoints each is available on.

![Metadata page](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/wikis/metadata-chat.png)

::: info Tip
In the screenshot the root path is `http://localhost/ServiceStack.Hello/servicestack`. On your development box the root path might be something like `http://localhost:60335` (ie the URL on which your webservice is hosted).
:::

Let's access the HelloWorld service you created in your browser, so write the following URL in your address bar:

```
GET http://<root_path>/hello/YourName
```

> E.g. http://example.org/hello/Max

As you can see after clicking on this link, ServiceStack also includes an HTML response format, which makes the XML/JSON (...) output human-readable. To change the return format to JSON, simply add `?format=json` to the end of the URL. You'll learn more about formats, endpoints (URLs, etc) when you continue reading the documentation.

## Troubleshooting

If you happen to generate requests from the WSDLs with a tool like SoapUI you may end up with an incorrectly generated request like this:

```xml
<soap:Envelope xmlns:soap="http://www.w3.org/2003/05/soap-envelope" xmlns:typ="http://schemas.servicestack.net/types">
   <soap:Header/>
   <soap:Body>
      <typ:Hello/>
   </soap:Body>
</soap:Envelope>
```

You can resolve this issue by adding the following line to your AssemblyInfo file:

```csharp
[assembly: ContractNamespace("http://schemas.servicestack.net/types", ClrNamespace = "<YOUR NAMESPACE>")]
```

Rebuild and regenerate the request from the updated WSDL. You should get a correct request this time.
```xml
<soap:Envelope xmlns:soap="http://www.w3.org/2003/05/soap-envelope" xmlns:typ="http://schemas.servicestack.net/types">
   <soap:Header/>
   <soap:Body>
      <typ:Hello>
         <!--Optional:-->
         <typ:Name>?</typ:Name>
      </typ:Hello>
   </soap:Body>
</soap:Envelope>
```

## Explore ServiceStack Documented Demo

The [EmailContacts solution](https://github.com/ServiceStackApps/EmailContacts/) is a guide that walks through the recommended setup and physical layout structure of typical medium-sized ServiceStack projects, including complete documentation of how to create the solution from scratch, whilst explaining all the ServiceStack features it makes use of along the way.

# Community Resources

- [Creating A Simple Service Using ServiceStack](http://shashijeevan.net/2015/09/20/creating-a-simple-service-using-servicestack/) by [Shashi Jeevan](http://shashijeevan.net/author/shashijeevan/)
- [Introducing ServiceStack](http://www.dotnetcurry.com/showarticle.aspx?ID=1056) by [@dotnetcurry](https://twitter.com/DotNetCurry)
- [Create web services in .NET in a snap with ServiceStack](http://www.techrepublic.com/article/create-web-services-in-net-in-a-snap-with-servicestack/) by [@techrepublic](https://twitter.com/techrepublic)
- [How to build web services in MS.Net using ServiceStack](http://kborra.wordpress.com/2014/07/29/how-to-build-web-services-in-ms-net-using-service-stack/) by [@kishoreborra](http://kborra.wordpress.com/about/)
- [Getting started with ServiceStack – Creating a service](http://dilanperera.wordpress.com/2014/02/22/getting-started-with-servicestack-creating-a-service/)
- [Fantastic Step-by-step walk-thru into ServiceStack with Screenshots!](http://nilsnaegele.com/codeedge/servicestack.html) by [@nilsnagele](https://twitter.com/nilsnagele)
- [Your first REST service with ServiceStack](http://tech.pro/tutorial/1148/your-first-rest-service-with-servicestack) by [@cyberzeddk](https://twitter.com/cyberzeddk)
- [New course: Using ServiceStack to Build
APIs](http://blog.pluralsight.com/2012/11/29/new-course-using-servicestack-to-build-apis/) by [@pluralsight](http://twitter.com/pluralsight)
- [ServiceStack the way I like it](http://tonyonsoftware.blogspot.co.uk/2012/09/lessons-learned-whilst-using.html) by [@tonydenyer](https://twitter.com/tonydenyer)
- [Generating a RESTful Api and UI from a database with LLBLGen](http://www.mattjcowan.com/funcoding/2013/03/10/rest-api-with-llblgen-and-servicestack/) by [@mattjcowan](https://twitter.com/mattjcowan)
- [ServiceStack: Reusing DTOs](http://korneliuk.blogspot.com/2012/08/servicestack-reusing-dtos.html) by [@korneliuk](https://twitter.com/korneliuk)
- [ServiceStack, Rest Service and EasyHttp](http://blogs.lessthandot.com/index.php/WebDev/ServerProgramming/servicestack-restservice-and-easyhttp) by [@chrissie1](https://twitter.com/chrissie1)
- [Building a Web API in SharePoint 2010 with ServiceStack](http://www.mattjcowan.com/funcoding/2012/05/04/building-a-web-api-in-sharepoint-2010-with-servicestack)
- [JQueryMobile and ServiceStack: EventsManager tutorial part #3](http://paymentnetworks.wordpress.com/2012/04/24/jquerymobile-and-service-stack-eventsmanager-tutorial-post-3/) by Kyle Hodgson
- [REST Raiding.
ServiceStack](http://dgondotnet.blogspot.de/2012/04/rest-raiding-servicestack.html) by [Daniel Gonzalez](http://www.blogger.com/profile/13468563783321963413)
- [JQueryMobile and ServiceStack: EventsManager tutorial](http://kylehodgson.com/2012/04/21/jquerymobile-and-service-stack-eventsmanager-tutorial-post-2/) / [Part 3](http://kylehodgson.com/2012/04/23/jquerymobile-and-service-stack-eventsmanager-tutorial-post-3/) by Kyle Hodgson
- [Like WCF: Only cleaner!](http://kylehodgson.com/2012/04/18/like-wcf-only-cleaner-9/) by Kyle Hodgson
- [ServiceStack vs WCF Data Services](http://codealoc.wordpress.com/2012/03/24/service-stack-vs-wcf-data-services/)
- [Building a Tridion WebService with jQuery and ServiceStack](http://www.curlette.com/?p=161) by [@robrtc](https://twitter.com/robrtc)
- [Anonymous type + Dynamic + ServiceStack == Consuming cloud has never been easier](http://www.ienablemuch.com/2012/05/anonymous-type-dynamic-servicestack.html) by [@ienablemuch](https://twitter.com/ienablemuch)
- [Handful of examples of using ServiceStack based on the ServiceStack.Hello Tutorial](https://github.com/jfoshee/TryServiceStack) by [@82unpluggd](https://twitter.com/82unpluggd)

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/create-your-first-webservice.md

---
slug: create-your-first-webservice
title: Create your first WebService
---

This is a quick walkthrough of getting your first web service up and running whilst having a look at how some of the different components work.
## Step 1: Install the x dotnet tool

First we want to install the [x dotnet tool](/dotnet-tool):

:::sh
dotnet tool install --global x
:::

The [dotnet tools](/dotnet-tool) are ServiceStack's versatile companion giving you quick access to a lot of its high-level features, including generating mobile, web & desktop DTOs with [Add ServiceStack Reference](/add-servicestack-reference), generating [gRPC Clients and proto messages](/grpc/), quickly [applying gists](/mix-tool) to your project enabled by ServiceStack's effortless [no-touch Modular features](/modular-startup), and [command-line API access](/post-command). It even includes a [Lisp REPL](https://sharpscript.net/lisp/) should you need to explore your [remote .NET Apps in real-time](https://sharpscript.net/lisp/#techstacks-tcp-lisp-repl-demo).

## Step 2: Selecting a template

Importantly, the dotnet tools let you create [.NET 10, .NET Framework](/dotnet-new) and [ASP.NET Core on .NET Framework](/templates/corefx) projects. Unless you're restricted to working with .NET Framework you'll want to start with a [.NET 10 project template](/templates/dotnet-new#usage). For this example we'll start with the empty [web](https://github.com/NetCoreTemplates/web) template which implicitly uses the folder name for the Project Name:

:::sh
npx create-net web WebApp
:::

## Step 3: Run your project

Press `Ctrl+F5` to run your project!
You should see an already working API integration using the [@servicestack/client](/javascript-client) library to call your App's [JavaScript DTOs](/javascript-add-servicestack-reference) and links to calling your API from [API Explorer](/api-explorer):

<a href="https://web.web-templates.io"><img class="max-w-lg" src="/img/pages/overview/web-hello.png"></a>

#### Watched builds

A recommended alternative to running your project from your IDE is to run a watched build using `dotnet watch` from a terminal:

:::sh
dotnet watch
:::

Where it will automatically rebuild & restart your App when it detects any changes to your App's source files.

### How does it work?

Now that your new project is running, let's have a look at what we have. The template comes with a single web service route which comes from the Request DTO (Data Transfer Object) which is located in the [Hello.cs](https://github.com/NetCoreTemplates/web/blob/master/MyApp.ServiceModel/Hello.cs) file:

```csharp
[Route("/hello/{Name}")]
public class Hello : IReturn<HelloResponse>
{
    public string Name { get; set; }
}

public class HelloResponse
{
    public string Result { get; set; }
}
```

The `Route` attribute specifies the path `/hello/{Name}`, where `{Name}` binds its value to the public string property **Name**.

Let's access the route to see what comes back. Go to the following URL in your address bar:

/hello/world

You will see a snapshot of the Result in an HTML response format. To change the return format to JSON, simply add `?format=json` to the end of the URL. You'll learn more about [formats](/formats), endpoints (URLs, etc) when you continue reading the documentation.

If we go back to the solution and find the ServiceInterface project and open the **MyServices.cs** file, we can have a look at the code that is responding to the browser, giving us the **Result** back.
```csharp
public class MyServices : Service
{
    public object Any(Hello request)
    {
        return new HelloResponse { Result = $"Hello, {request.Name}!" };
    }
}
```

If we look at the code above, there are a few things to note. The name of the method `Any` means the server will run this method for any of the valid HTTP Verbs. Service methods are where you control what returns from your service.

## Step 4: Exploring the ServiceStack Solution

The recommended structure below is built into all ServiceStackVS VS.NET Templates, where creating any new ServiceStack project will create a solution with a minimum of the 4 projects below, ensuring ServiceStack solutions start off from an optimal logical project layout, laying the foundation for growing into a more maintainable, cohesive and reusable code-base:

<img align="right" src="/img/pages/solution-layout.png" />

### Host Project

The Host project contains your AppHost which references and registers all your App's concrete dependencies in its IOC and is the central location where all App configuration and global behavior is maintained. It also references all Web Assets like Razor Views, JS, CSS, Images, Fonts, etc. that need to be deployed with the App.

The AppHost is the top-level project which references all dependencies used by your App. Its role is akin to an orchestrator and conduit, deciding what functionality is made available and which concrete implementations are used. By design it references all other (non-test) projects whilst nothing references it, and as a goal it should be kept free of any App or Business logic.

### ServiceInterface Project

The ServiceInterface project is the implementation project where all Business Logic and Services live, which typically references every other project except the Host projects. Small and Medium projects can maintain all their implementation here where logic can be grouped under feature folders.
Large solutions can split this project into more manageable cohesive and modular projects, which we also recommend encapsulate any dependencies they might use.

### ServiceModel Project

The ServiceModel Project contains all your Application's DTOs, which define your Services' contract. Keeping them isolated from any Server implementation is how your Service is able to encapsulate its capabilities and make them available behind a remote facade.

There should only be one ServiceModel project per solution, containing all your DTOs, and it should be implementation, dependency and logic-free, referencing only the impl/dep-free **ServiceStack.Interfaces.dll** contract assembly. This ensures Service contracts are decoupled from their implementation and enforces interoperability, so your Services don't mandate specific client implementations. It also makes this the only project clients need in order to call any of your Services, by either referencing the **ServiceModel.dll** directly or downloading the DTOs from a remote ServiceStack instance using [Add ServiceStack Reference](/add-servicestack-reference):

![](/img/pages/dtos-role.png)

### Test Project

The Unit Test project contains all your Unit and Integration tests. It's also a Host project that typically references all other non-Host projects in the solution and contains a combination of concrete and mock dependencies depending on what's being tested. See the [Testing Docs](/testing) for more information on testing ServiceStack projects.

## Learn ServiceStack Guide

If you're new to ServiceStack we recommend stepping through [ServiceStack's Getting Started Guide](https://servicestack.net/start/project-overview) to get familiar with the basics.

## API Client Examples

### jQuery Ajax

ServiceStack's clean Web Services make it simple and intuitive to call ServiceStack Services from any ajax client, e.g.
from a traditional [Bootstrap Website using jQuery](https://github.com/ServiceStack/Templates/blob/master/src/ServiceStackVS/BootstrapWebApp/BootstrapWebApp/default.cshtml):

```html
<input class="form-control" id="Name" type="text" placeholder="Type your name">
<p id="result"></p>
<script>
$('#Name').keyup(function () {
    let name = $(this).val()
    $.getJSON('/hello/' + name)
        .done(function (response) {
            $('#result').html(response.Result)
        })
})
</script>
```

### Rich JsonApiClient & Typed DTOs

The modern recommended alternative to jQuery that works in all modern browsers is using your API's built-in [JavaScript typed DTOs](/javascript-add-servicestack-reference) with the [@servicestack/client](/javascript-client) library from a [JavaScript Module](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Modules).

We recommend using an [importmap](https://developer.mozilla.org/en-US/docs/Web/HTML/Element/script/type/importmap) to specify where **@servicestack/client** should be loaded from, e.g:

```html
<script async src="https://ga.jspm.io/npm:es-module-shims@1.6.3/dist/es-module-shims.js"></script><!--safari-->
<script type="importmap">
{
    "imports": {
        "@servicestack/client":"https://unpkg.com/@servicestack/client@2/dist/servicestack-client.mjs"
    }
}
</script>
```

This lets us reference the **@servicestack/client** package name in our source code instead of its physical location:

```html
<input type="text" id="txtName">
<div id="result"></div>
```

```html
<script type="module">
import { JsonServiceClient, $1, on } from '@servicestack/client'
import { Hello } from '/types/mjs'

const client = new JsonServiceClient()
on('#txtName', {
    async keyup(el) {
        const api = await client.api(new Hello({ name: el.target.value }))
        $1('#result').innerHTML = api.response.result
    }
})
</script>
```

### Enable static analysis and intelli-sense

For better IDE intelli-sense during development, save the annotated Typed DTOs to disk with the [x dotnet tool](/dotnet-tool):

:::sh
x mjs
:::
Then reference it instead to enable IDE static analysis when calling Typed APIs from JavaScript:

```js
import { Hello } from '/js/dtos.mjs'
client.api(new Hello({ name }))
```

To also enable static analysis for **@servicestack/client**, install the dependency-free library as a dev dependency:

:::sh
npm install -D @servicestack/client
:::

Where only its TypeScript definitions are used by the IDE during development to enable its type-checking and intelli-sense.

### Rich intelli-sense support

Where you'll be able to benefit from rich intelli-sense support in smart IDEs like [Rider](https://www.jetbrains.com/rider/) for both the client library:

![](/img/pages/mix/init-rider-ts-client.png)

As well as your App's server generated DTOs:

![](/img/pages/release-notes/v6.6/mjs-intellisense.png)

So even simple Apps without complex bundling solutions or external dependencies can still benefit from a rich typed authoring experience without any additional build time or tooling complexity.

## Create Empty ServiceStack Apps

::include empty-projects.md::

### Any TypeScript or JavaScript Web, Node.js or React Native App

The same TypeScript [JsonServiceClient](/javascript-client) can also be used in more sophisticated JavaScript Apps like [React Native](/typescript-add-servicestack-reference#react-native-jsonserviceclient) to [Node.js Server Apps](https://github.com/ServiceStackApps/typescript-server-events) such as this example using TypeScript & [Vue Single-File Components](https://vuejs.org/guide/scaling-up/sfc.html):

```html
<template>
  <div v-if="api.error" class="ml-2 text-red-500">{{ api.error.message }}</div>
  <div v-else class="ml-3 mt-2 text-2xl">{{ api.loading ? 'Loading...'
    : api.response.result }}</div>
</template>

<script setup lang="ts">
import { JsonServiceClient } from "@servicestack/client"
import { Hello } from "@/dtos"

const props = defineProps<{ name: string }>()

const client = new JsonServiceClient()
const api = client.api(new Hello({ name: props.name }))
</script>
```

Compare and contrast with other major SPA JavaScript Frameworks:

- [Vue 3 HelloApi.mjs](https://github.com/NetCoreTemplates/blazor-vue/blob/main/MyApp/wwwroot/posts/components/HelloApi.mjs)
- [Vue HelloApi.vue](https://github.com/NetCoreTemplates/vue-spa/blob/main/MyApp.Client/src/_posts/components/HelloApi.vue)
- [Next.js with swrClient](https://github.com/NetCoreTemplates/nextjs/blob/main/ui/components/intro.tsx)
- [React HelloApi.tsx](https://github.com/NetCoreTemplates/react-spa/blob/master/MyApp/src/components/Home/HelloApi.tsx)
- [Angular HelloApi.ts](https://github.com/NetCoreTemplates/angular-spa/blob/master/MyApp/src/app/home/HelloApi.ts)

### Web, Mobile and Desktop Apps

Use [Add ServiceStack Reference](/add-servicestack-reference) to enable typed integrations for the most popular languages to develop Web, Mobile & Desktop Apps.
### Full .NET Project Templates

The above `init` projects allow you to create a minimal web app. To create a more complete ServiceStack App with the recommended project structure, start with one of our C# project templates instead:

### [C# Project Templates Overview](/templates/)

## Simple, Modern Razor Pages & MVC Vue 3 Tailwind Templates

The new Tailwind Razor Pages & MVC Templates enable rapid development of Modern Tailwind Apps without the [pitfalls plaguing SPAs](https://servicestack.net/posts/javascript):

<lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="SyppvQB7IPs" style="background-image: url('https://img.youtube.com/vi/SyppvQB7IPs/maxresdefault.jpg')"></lite-youtube>

All Vue Tailwind templates are pre-configured with our rich [Vue 3 Tailwind Components](/vue/) library for maximum productivity:

<lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="YIa0w6whe2U" style="background-image: url('https://img.youtube.com/vi/YIa0w6whe2U/maxresdefault.jpg')"></lite-youtube>

## Advanced JAMStack Templates

For more sophisticated Apps that need the best web tooling that npm can offer, checkout our JAMStack Vite Vue & SSG templates:

<lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="D-rU0lU_B4I" style="background-image: url('https://img.youtube.com/vi/D-rU0lU_B4I/maxresdefault.jpg')"></lite-youtube>

Or if you prefer Modern React Apps, checkout the Next.js template:

<lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="3pPLRyPsO5A" style="background-image: url('https://img.youtube.com/vi/3pPLRyPsO5A/maxresdefault.jpg')"></lite-youtube>

For Blazor WASM and Server checkout our comprehensive [Blazor projects & Tailwind components](/templates/blazor-tailwind).
### Integrated in Major IDEs and popular Mobile & Desktop platforms ServiceStack Services are also [easily consumable from all major Mobile and Desktop platforms](/why-servicestack#generate-instant-typed-apis-from-within-all-major-ides) including native iPhone and iPad Apps on iOS with Swift, Mobile and Tablet Apps on Android with Java or Kotlin, OSX Desktop Applications as well as targeting the most popular .NET Mobile and Desktop platforms including Xamarin.iOS, Xamarin.Android, Windows Store, WPF and WinForms. ## Instant Client Apps Generate working native client apps for your live ServiceStack services, in a variety of languages, instantly with our free managed service. This tool enables your developers, and even your customers, to open a working example native application straight from the web to their favorite IDE. <lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="GTnuMhvUayg" style="background-image: url('https://img.youtube.com/vi/GTnuMhvUayg/maxresdefault.jpg')"></lite-youtube> ## Fundamentals - AppHost and Configuration Walk through configuring your ServiceStack Application's `AppHost`: <lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="mOpx5mUGoqI" style="background-image: url('https://img.youtube.com/vi/mOpx5mUGoqI/maxresdefault.jpg')"></lite-youtube> ## Community Resources - [Creating A Simple Service Using ServiceStack](https://www.c-sharpcorner.com/UploadFile/shashijeevan/creating-a-simple-service-using-servicestack779/) by [Shashi Jeevan](http://shashijeevan.net/author/shashijeevan/) - [Introducing ServiceStack](https://www.dotnetcurry.com/aspnet/1056/introducing-service-stack-tutorial) by [@dotnetcurry](https://twitter.com/DotNetCurry) - [Create web services in .NET in a snap with ServiceStack](https://www.techrepublic.com/article/create-web-services-in-net-in-a-snap-with-servicestack/) by [@techrepublic](https://twitter.com/techrepublic) - [How to build web services in MS.Net using 
ServiceStack](https://kborra.wordpress.com/2014/07/29/how-to-build-web-services-in-ms-net-using-service-stack/) by [@kishoreborra](http://kborra.wordpress.com/about/) - [Getting started with ServiceStack – Creating a service](https://dilanperera.wordpress.com/2014/02/22/getting-started-with-servicestack-creating-a-service/) - [ServiceStack Quick Start](https://debuggers.domains/post/servicestack-quick-start/) by [@aarondandy](https://github.com/aarondandy) - [Getting Started with ASP.NET MVC, ServiceStack and Bootstrap](https://www.pluralsight.com/courses/getting-started-aspdotnet-mvcservice-stack-bootstrap) by [@pluralsight](http://twitter.com/pluralsight) - [Building Web Applications with Open-Source Software on Windows](https://www.pluralsight.com/courses/building-web-application-open-source-software-on-windows) by [@pluralsight](http://twitter.com/pluralsight) - [ServiceStack the way I like it](https://www.antonydenyer.co.uk/2012-09-20-servicestack-the-way-i-like-it/) by [@tonydenyer](https://twitter.com/tonydenyer) - [Generating a RESTful Api and UI from a database with LLBLGen](https://www.mattjcowan.com/funcoding/2013/03/10/rest-api-with-llblgen-and-servicestack/) by [@mattjcowan](https://twitter.com/mattjcowan) - [ServiceStack: Reusing DTOs](https://korneliuk.blogspot.com/2012/08/servicestack-reusing-dtos.html) by [@korneliuk](https://twitter.com/korneliuk) - [ServiceStack, Rest Service and EasyHttp](https://blogs.lessthandot.com/index.php/WebDev/ServerProgramming/servicestack-restservice-and-easyhttp) by [@chrissie1](https://twitter.com/chrissie1) - [Building a Web API in SharePoint 2010 with ServiceStack](https://www.mattjcowan.com/funcoding/2012/05/04/building-a-web-api-in-sharepoint-2010-with-servicestack/) - [REST Raiding. 
ServiceStack](https://dgondotnet.blogspot.com/2012/04/rest-raiding-servicestack.html) by [Daniel Gonzalez](http://www.blogger.com/profile/13468563783321963413) - [JQueryMobile and ServiceStack: EventsManager tutorial](https://kylehodgson.com/2012/04/21/jquerymobile-and-service-stack-eventsmanager-tutorial-post-2/) / [Part 3](https://kylehodgson.com/2012/04/23/jquerymobile-and-service-stack-eventsmanager-tutorial-post-3/) by Kyle Hodgson - [Like WCF: Only cleaner!](https://kylehodgson.com/2012/04/18/like-wcf-only-cleaner-9/) by Kyle Hodgson - [ServiceStack I heart you. My conversion from WCF to SS](https://www.philliphaydon.com/2012/02/21/service-stack-i-heart-you-my-conversion-from-wcf-to-ss/) by [@philliphaydon](https://twitter.com/philliphaydon) - [ServiceStack vs WCF Data Services](https://codealoc.wordpress.com/2012/03/24/service-stack-vs-wcf-data-services/) - [Building a Tridion WebService with jQuery and ServiceStack](https://www.curlette.com/?p=161) by [@robrtc](https://twitter.com/#!/robrtc) - [Anonymous type + Dynamic + ServiceStack == Consuming cloud has never been easier](https://www.ienablemuch.com/2012/05/anonymous-type-dynamic-servicestack.html) by [@ienablemuch](https://twitter.com/ienablemuch) - [Handful of examples of using ServiceStack based on the ServiceStack.Hello Tutorial](https://github.com/jfoshee/TryServiceStack) by [@82unpluggd](https://twitter.com/82unpluggd) --- # Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/autoquery/crud.md --- title: AutoQuery CRUD --- AutoQuery Services include support for implementing much of a CRUD Service's logic declaratively, including support for multi-tenancy, optimistic concurrency, declarative validation, Auto Mapping of external Request/Response DTOs to data model properties, and auto populating properties using full #Script Expressions, e.g. to populate timestamps, authenticated user information or newly generated UUIDs. 
Just like AutoQuery, CRUD Services are ServiceStack Services where you can continue using the same functionality to specify optimal user-defined routes for HTTP APIs, the same Request/Response and Attribute filters to apply custom logic, and continue enjoying the entire ecosystem around ServiceStack Services, including being able to invoke them via [gRPC](/grpc/), [MQ endpoints](/messaging) and its rich client ecosystem for enabling end-to-end Typed APIs with [Add ServiceStack Reference](/add-servicestack-reference). AutoQuery Services are fast & emit a clean, optimal "pure serialized POCO" wire format. They're built on OrmLite's high-performance APIs where all AutoQuery APIs are `async` by default, but they also offer native sync APIs if you need to enlist any of AutoQuery's functionality in custom sync methods (that are unable to be converted into viral async APIs). Importantly, AutoQuery Services are "future-proofed" and can be overridden with a custom implementation that can either augment the existing AutoQuery functionality and enhance it with custom behavior (e.g. if not possible to implement declaratively), or if needed its entire implementation can be replaced without breaking its design contract & existing client integrations, should the Service later need to be rebuilt against alternative data sources. ## Rapidly develop data-driven systems As AutoQuery lets you declaratively develop Services by just defining their API Contract with POCO DTOs, you're able to develop entire data-driven systems in a fraction of the time it would take to implement them manually. In addition, AutoQuery Services are semantically richer as all capabilities are declaratively defined around typed data models, which makes it possible to build higher-level generic features like ServiceStack's Studio [Instant UI for AutoQuery Services](/studio-autoquery). 
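As a sketch of what declaratively defining an API contract can look like, here's a complete CRUD API surface declared with just Request DTOs and no hand-written implementation (the `Todo` table and DTO names are hypothetical; the `QueryDb`, `ICreateDb`, `IPatchDb` and `IDeleteDb` interfaces are the built-in AutoQuery interfaces covered below):

```csharp
// Hypothetical data model
public class Todo
{
    [AutoIncrement]
    public int Id { get; set; }
    public string Text { get; set; }
    public bool IsDone { get; set; }
}

// The complete API contract — AutoQuery generates every implementation
public class QueryTodos : QueryDb<Todo> {}                      // queries

public class CreateTodo : ICreateDb<Todo>, IReturn<IdResponse>  // inserts
{
    public string Text { get; set; }
}

public class UpdateTodo : IPatchDb<Todo>, IReturn<IdResponse>   // partial updates
{
    public int Id { get; set; }
    public string Text { get; set; }
    public bool? IsDone { get; set; }
}

public class DeleteTodo : IDeleteDb<Todo>, IReturnVoid          // deletes
{
    public int Id { get; set; }
}
```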
With AutoQuery you can now build entire Apps declaratively, developing high-performance Services accessible via ServiceStack's industry leading [myriad of Service endpoints](/why-servicestack#features-overview) and rich metadata services, all without needing to write any implementation! For a sample of the productivity enabled, check out the [Bookings CRUD](https://github.com/NetCoreApps/BookingsCrud) demo to create a multi-user ASP.NET Core Booking System from scratch within minutes with full Audit History, fine-grained permissions, declarative validation, running ad hoc queries & exporting to Excel by just defining code-first high-performance AutoQuery CRUD Typed APIs <lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="rSFiikDjGos" style="background-image: url('https://img.youtube.com/vi/rSFiikDjGos/maxresdefault.jpg')"></lite-youtube> ## Creating AutoQuery CRUD Services Just like [AutoQuery](/autoquery/rdbms), you just need to provide the typed Request DTO definitions for your DB Table APIs and AutoQuery automatically provides the implementation for the Service. To enlist Auto CRUD behavior your Request DTOs need to implement one of the following interfaces which dictates the behavior of the Service: - `ICreateDb<Table>` - Create new Table Entry - `IUpdateDb<Table>` - Update existing Table Entry - `IPatchDb<Table>` - Partially update existing Table Entry - `IDeleteDb<Table>` - Delete existing Table Entry All Request DTOs also require either an `IReturn<T>` or `IReturnVoid` marker interface to specify the return type of the Service. ::: info You can use the built-in `IReturn<EmptyResponse>` for an "empty" response whereas `IReturnVoid` returns "no" response. ::: Let's go through a simple example, starting with a simple POCO OrmLite data model we want to add to our RDBMS: ```csharp public class Rockstar { [AutoIncrement] public int Id { get; set; } public string FirstName { get; set; } public string LastName { get; set; } public int? 
Age { get; set; } public DateTime DateOfBirth { get; set; } public DateTime? DateDied { get; set; } public LivingStatus LivingStatus { get; set; } } ``` We can create a Service that **inserts** a new `Rockstar` by defining all the properties we want to allow API consumers to provide when creating a new Rockstar: ```csharp public class CreateRockstar : ICreateDb<Rockstar>, IReturn<CreateRockstarResponse> { public string FirstName { get; set; } public string LastName { get; set; } public int? Age { get; set; } public DateTime DateOfBirth { get; set; } } public class CreateRockstarResponse { public int Id { get; set; } // Id is auto populated with RDBMS generated Id public ResponseStatus ResponseStatus { get; set; } } ``` When ServiceStack starts, it generates the implementation for this Service, which can now insert Rockstars using your populated Request DTO: ```csharp var client = new JsonApiClient(baseUrl); client.Post(new CreateRockstar { FirstName = "Kurt", LastName = "Cobain", Age = 27, DateOfBirth = new DateTime(1967,2,20), }); ``` Similarly you can define **Update** and **Delete** Services the same way, e.g: ```csharp public class UpdateRockstar : Rockstar, IUpdateDb<Rockstar>, IReturn<UpdateRockstarResponse> {} public class UpdateRockstarResponse { public int Id { get; set; } // Id is auto populated with RDBMS generated Id public Rockstar Result { get; set; } // selects & returns latest DB Rockstar public ResponseStatus ResponseStatus { get; set; } } ``` By convention, if your Response DTO contains any of these properties they will be automatically populated: - `T Id` - The Primary Key - `T Result` - The POCO you want to return (can be a subset of DB model) - `int Count` - Return the number of rows affected (typically 1, but Deletes can be >1) Delete Services need only a Primary Key, e.g: ```csharp public class DeleteRockstar : IDeleteDb<Rockstar>, IReturnVoid { public int Id { get; set; } } ``` and to Query the Rockstar table you have the [full featureset of 
AutoQuery](/autoquery/rdbms) for a complete set of CRUD Services without needing to provide any implementations. ## Custom AutoQuery CRUD Implementation Just as you can create [Custom AutoQuery Implementations](/autoquery/rdbms#custom-autoquery-implementations) to override the default AutoQuery behavior, you can also override AutoQuery CRUD implementations by creating implementations with AutoQuery CRUD Request DTOs and calling the relevant `IAutoQueryDb` method, e.g: ```csharp public class MyCrudServices(IAutoQueryDb autoQuery) : Service { public object Post(CreateRockstar request) => autoQuery.Create(request, base.Request); public object Put(UpdateRockstar request) => autoQuery.Update(request, base.Request); public object Delete(DeleteRockstar request) => autoQuery.Delete(request, base.Request); } // Async public class MyCrudServices(IAutoQueryDb autoQuery) : Service { public Task<object> Post(CreateRockstar request) => autoQuery.CreateAsync(request, base.Request); public Task<object> Put(UpdateRockstar request) => autoQuery.UpdateAsync(request, base.Request); public Task<object> Delete(DeleteRockstar request) => autoQuery.DeleteAsync(request, base.Request); } ``` ### Custom implementations using OrmLite Typed APIs It's not strictly necessary to use `IAutoQueryDb` APIs to implement custom AutoQuery implementations as you could instead use OrmLite to implement similar CRUD behavior, e.g: ```csharp public class MyCrudServices : Service { public object Post(CreateRockstar request) { var id = (int) Db.Insert(request.ConvertTo<Rockstar>(), selectIdentity:true); return new CreateRockstarResponse { Id = id }; } public object Put(UpdateRockstar request) { Db.UpdateNonDefaults(request.ConvertTo<Rockstar>(), x => x.Id == request.Id); return new UpdateRockstarResponse { Id = request.Id, Result = Db.SingleById<Rockstar>(request.Id), }; } public void Delete(DeleteRockstar request) { Db.DeleteById<Rockstar>(request.Id); } } ``` Async version: ```csharp public class MyCrudServices : Service { 
public async Task<object> Post(CreateRockstar request) { var id = (int) await Db.InsertAsync(request.ConvertTo<Rockstar>(), selectIdentity:true); return new CreateRockstarResponse { Id = id }; } public async Task<object> Put(UpdateRockstar request) { await Db.UpdateNonDefaultsAsync(request.ConvertTo<Rockstar>(), x => x.Id == request.Id); return new UpdateRockstarResponse { Id = request.Id, Result = await Db.SingleByIdAsync<Rockstar>(request.Id), }; } public async Task Delete(DeleteRockstar request) { await Db.DeleteByIdAsync<Rockstar>(request.Id); } } ``` The above are equivalents of typical AutoQuery CRUD APIs using OrmLite directly, however if the AutoQuery APIs include [POCO references](/ormlite/reference-support), you'll need to use OrmLite's `Save()` API to save the reference complex types as well, e.g: ```csharp public class MyCrudServices : Service { public object Post(CreateRockstar request) { var row = request.ConvertTo<Rockstar>(); Db.Save(row, references: true); return new CreateRockstarResponse { Id = row.Id }; } } ``` ## AutoQuery CRUD Attributes AutoQuery CRUD extends existing [querying functionality in AutoQuery](/autoquery/rdbms) with additional features covering common functionality in CRUD operations: - `[AutoApply]` - Apply built-in composite generic behavior - `[AutoPopulate]` - Populate data models with generic user & system info - `[AutoFilter]` - Apply pre-configured filters to query operations - `[AutoMap]` - Map System Input properties to Data Model fields - `[AutoDefault]` - Specify to fallback default values when not provided - `[AutoIgnore]` - Ignore mapping Request DTO property to Data Model Each of these is covered in more detail in the docs and examples below. ### Advanced CRUD Example Let's now explore a more advanced example that implements Audit information as well as layered support for multi-tenancy to see how you can easily compose features. 
So let's say you have an interface that all tables you want to contain Audit information implement: ```csharp public interface IAudit { DateTime CreatedDate { get; set; } string CreatedBy { get; set; } string CreatedInfo { get; set; } DateTime ModifiedDate { get; set; } string ModifiedBy { get; set; } string ModifiedInfo { get; set; } DateTime? SoftDeletedDate { get; set; } string SoftDeletedBy { get; set; } string SoftDeletedInfo { get; set; } } ``` It's not required, but it's also useful to have a concrete base table which could be annotated like: ```csharp public abstract class AuditBase : IAudit { public DateTime CreatedDate { get; set; } [Required] public string CreatedBy { get; set; } [Required] public string CreatedInfo { get; set; } public DateTime ModifiedDate { get; set; } [Required] public string ModifiedBy { get; set; } [Required] public string ModifiedInfo { get; set; } [Index] //Check if Deleted public DateTime? SoftDeletedDate { get; set; } public string SoftDeletedBy { get; set; } public string SoftDeletedInfo { get; set; } } ``` #### AutoPopulate Examples We can then create a base Request DTO that all Audit Create Services will implement: ```csharp [ValidateIsAuthenticated] [AutoPopulate(nameof(IAudit.CreatedDate), Eval = "utcNow")] [AutoPopulate(nameof(IAudit.CreatedBy), Eval = "userAuthName")] //or userAuthId [AutoPopulate(nameof(IAudit.CreatedInfo), Eval = "`${userSession.DisplayName} (${userSession.City})`")] [AutoPopulate(nameof(IAudit.ModifiedDate), Eval = "utcNow")] [AutoPopulate(nameof(IAudit.ModifiedBy), Eval = "userAuthName")] //or userAuthId [AutoPopulate(nameof(IAudit.ModifiedInfo), Eval = "`${userSession.DisplayName} (${userSession.City})`")] public abstract class CreateAuditBase<Table,TResponse> : ICreateDb<Table>, IReturn<TResponse> {} ``` The `*Info` properties are a superfluous example showing that you can evaluate any `#Script` expression. Typically you'd only save the User Id or Username. 
These all call [#Script Methods](https://sharpscript.net/docs/methods) which you can [add/extend yourself](https://sharpscript.net/docs/script-pages#extend), e.g: ```csharp public override void Configure() { // Register custom script methods ScriptContext.ScriptMethods.Add(new MyScripts()); } // Custom #Script Methods, see: https://sharpscript.net/docs/methods public class MyScripts : ScriptMethods { public string tenantId(ScriptScopeContext scope) { var req = scope.GetRequest(); var requestDto = req.Dto; return requestDto is IHasTenantId hasTenantId ? hasTenantId.TenantId // Explicitly set on Request DTO : req.AbsoluteUri.RightPart("//").LeftPart('.'); //Fallback to use subdomain } } // Populate Post.TenantId property with `tenantId` #Script Method [AutoPopulate(nameof(Post.TenantId), Eval = "tenantId")] public class CreatePost : ICreateDb<Post>, IReturn<IdResponse> { public string Name { get; set; } public string Content { get; set; } } ``` ### AutoPopulate The `[AutoPopulate]` attribute tells AutoCrud that you want the DB Table to automatically populate these properties, which can be populated using any of its properties below: - **Value** - A constant value that can be used in C# Attributes, e.g `Value="Foo"` - **Expression** - A Lightweight [#Script](https://sharpscript.net/) Expression that results in a constant value that's only evaluated once and cached globally, e.g. `Expression = "date(2001,1,1)"`, useful for values that can't be defined in C# Attributes like `DateTime`, can be any [#Script Method](https://sharpscript.net/docs/default-scripts). - **Eval** - A [#Script](https://sharpscript.net/) Expression that's cached per request. E.g. `Eval="utcNow"` calls the `utcNow` Script method which returns `DateTime.UtcNow` which is cached for that request so all other `utcNow` expressions will return the same exact value. - **NoCache** - Don't cache the expression, evaluate it each time. 
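To illustrate how these options differ, here's a hedged sketch contrasting the population modes on a hypothetical `Document` table (the table and its property names are made up for this example):

```csharp
// Hypothetical data model
public class Document
{
    [AutoIncrement] public int Id { get; set; }
    public string Title { get; set; }
    public string Category { get; set; }
    public DateTime Epoch { get; set; }
    public DateTime CreatedDate { get; set; }
    public DateTime Revised { get; set; }
}

[AutoPopulate(nameof(Document.Category), Value = "General")]              // C# constant
[AutoPopulate(nameof(Document.Epoch), Expression = "date(2001,1,1)")]     // evaluated once, cached globally
[AutoPopulate(nameof(Document.CreatedDate), Eval = "utcNow")]             // evaluated & cached per request
[AutoPopulate(nameof(Document.Revised), Eval = "utcNow", NoCache = true)] // re-evaluated on each use
public class CreateDocument : ICreateDb<Document>, IReturn<IdResponse>
{
    public string Title { get; set; }
}
```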
AutoCrud makes extensive usage of `#Script` expressions for much of its declarative functionality, which always execute their cached ASTs, so expressions are only parsed once and remain fast to evaluate even when results are not cached. Let's now layer on additional generic functionality by inheriting and extending the base class, e.g. if we want our table to support [Multitenancy](/multitenancy) we could extend it with: ```csharp [AutoPopulate(nameof(IAuditTenant.TenantId), Eval = "Request.Items.TenantId")] public abstract class CreateAuditTenantBase<Table,TResponse> : CreateAuditBase<Table,TResponse> {} ``` Where `TenantId` is added in a Global Request Filter (e.g. after inspecting the authenticated UserSession to determine the tenant they belong to), e.g: ```csharp const string TenantId = nameof(TenantId); void SetTenant(IRequest req, IResponse res, object dto) { var userSession = req.SessionAs<AuthUserSession>(); if (userSession.IsAuthenticated) { req.SetItem(TenantId, userSession.City switch { "London" => 10, "Perth" => 11, //... _ => 100, }); } } GlobalRequestFilters.Add(SetTenant); // HTTP Requests GlobalMessageRequestFilters.Add(SetTenant); // MQ Requests ``` Now we can easily implement custom "Audited" and "Multi Tenant" CRUD Services by inheriting these base Services. Here's an example of our custom Table that implements our `AuditBase` class with a `TenantId` to capture the Tenant the record should be saved to: ```csharp public class RockstarAuditTenant : AuditBase { [Index] public int TenantId { get; set; } [AutoIncrement] public int Id { get; set; } public string FirstName { get; set; } public string LastName { get; set; } public int? Age { get; set; } public DateTime DateOfBirth { get; set; } public DateTime? 
DateDied { get; set; } public LivingStatus LivingStatus { get; set; } } ``` Our Service's Request DTO can now inherit our base Audit & Multitenant enabled class: ```csharp public class CreateRockstarAuditTenant : CreateAuditTenantBase<RockstarAuditTenant, CreateRockstarResponse> { public string FirstName { get; set; } public string LastName { get; set; } public int? Age { get; set; } public DateTime DateOfBirth { get; set; } } ``` And all the decorated properties will be automatically populated when creating the Rockstar with `CreateRockstarAuditTenant`, e.g: ```csharp client.Post(new CreateRockstarAuditTenant { FirstName = "Kurt", LastName = "Cobain", Age = 27, DateOfBirth = new DateTime(1967,2,20), }); ``` We can create the same base classes for Updates: ```csharp [ValidateIsAuthenticated] [AutoPopulate(nameof(IAudit.ModifiedDate), Eval = "utcNow")] [AutoPopulate(nameof(IAudit.ModifiedBy), Eval = "userAuthName")] //or userAuthId [AutoPopulate(nameof(IAudit.ModifiedInfo), Eval = "`${userSession.DisplayName} (${userSession.City})`")] public abstract class UpdateAuditBase<Table,TResponse> : IUpdateDb<Table>, IReturn<TResponse> {} [AutoFilter(nameof(IAuditTenant.TenantId), Eval="Request.Items.TenantId")] public abstract class UpdateAuditTenantBase<Table,TResponse> : UpdateAuditBase<Table,TResponse> {} public class UpdateRockstarAuditTenant : UpdateAuditTenantBase<RockstarAuditTenant, RockstarWithIdResponse> { public int Id { get; set; } public string FirstName { get; set; } public LivingStatus? LivingStatus { get; set; } } ``` ::: info The `[AutoPopulate]` properties only appear on the Data Model, not the external Request DTO, since we don't want external API consumers to populate them. 
::: For Apps that prefer to never delete rows and instead mark records as deleted so an audit trail is retained, we can implement "Soft Deletes" using an UPDATE to populate the `SoftDelete*` fields behind-the-scenes: ```csharp [ValidateIsAuthenticated] [AutoPopulate(nameof(IAudit.SoftDeletedDate), Eval = "utcNow")] [AutoPopulate(nameof(IAudit.SoftDeletedBy), Eval = "userAuthName")] //or userAuthId [AutoPopulate(nameof(IAudit.SoftDeletedInfo), Eval = "`${userSession.DisplayName} (${userSession.City})`")] public abstract class SoftDeleteAuditBase<Table,TResponse> : IUpdateDb<Table>, IReturn<TResponse> {} [AutoFilter(QueryTerm.Ensure, nameof(IAuditTenant.TenantId), Eval = "Request.Items.TenantId")] public abstract class SoftDeleteAuditTenantBase<Table,TResponse> : SoftDeleteAuditBase<Table,TResponse> {} public class SoftDeleteAuditTenant : SoftDeleteAuditTenantBase<RockstarAuditTenant, RockstarWithIdResponse> { public int Id { get; set; } } ``` To implement a "Real" permanently destructive DELETE you would instead implement `IDeleteDb<T>`: ```csharp [ValidateIsAuthenticated] [AutoFilter(QueryTerm.Ensure, nameof(IAuditTenant.TenantId), Eval = "Request.Items.TenantId")] public class RealDeleteAuditTenant : IDeleteDb<RockstarAuditTenant>, IReturn<RockstarWithIdResponse> { public int Id { get; set; } public int? 
Age { get; set; } } ``` ### Multi RDBMS Services As they're just regular ServiceStack Services, everything you're used to with normal Services also works with the new Auto Crud Services. To recap, you can annotate the **DB Model** with the `[NamedConnection]` attribute to specify which [registered named connection](/multitenancy#changedb-apphost-registration) AutoQuery should use: ```csharp [NamedConnection("Reporting")] public class NamedRockstar : Rockstar { } //DB Model ``` Where all AutoQuery Services for that data model will query the **Reporting** database instead: ```csharp public class CreateNamedRockstar : RockstarBase, ICreateDb<NamedRockstar>, IReturn<RockstarWithIdAndResultResponse> { public int Id { get; set; } } public class UpdateNamedRockstar : RockstarBase, IUpdateDb<NamedRockstar>, IReturn<RockstarWithIdAndResultResponse> { public int Id { get; set; } } ``` #### Custom AutoQuery CRUD Services Alternatively `[ConnectionInfo]` can be [used on Service implementations](/multitenancy#connectioninfo-attribute), but as AutoQuery Services don't have Service implementations, you'd need to provide custom implementations that delegate to their respective Auto Crud API, e.g: ```csharp [ConnectionInfo(NamedConnection = MyDatabases.Reporting)] public class MyReportingServices(IAutoQueryDb autoQuery) : Service { public Task<object> Any(CreateConnectionInfoRockstar request) => autoQuery.CreateAsync(request, Request); public Task<object> Any(UpdateConnectionInfoRockstar request) => autoQuery.UpdateAsync(request, Request); } ``` ### AutoFilter If you're creating Soft Delete & multi-tenant Services you'll want to ensure that every query only returns records in their tenant and doesn't return deleted items, which we can implement using an `[AutoFilter]`, e.g: ```csharp [ValidateIsAuthenticated] [AutoFilter(QueryTerm.Ensure, nameof(IAudit.SoftDeletedDate), Template = SqlTemplate.IsNull)] [AutoFilter(QueryTerm.Ensure, nameof(IAuditTenant.TenantId), Eval = "Request.Items.TenantId")] 
public abstract class QueryDbTenant<From, Into> : QueryDb<From, Into> {} ``` The `[AutoFilter]` lets you add pre-configured filters to the query. `QueryTerm.Ensure` utilizes OrmLite's new `Ensure()` APIs which force always applying this filter, even if the query contains other `OR` conditions. This base class will then let you create concrete queries that don't return soft-deleted rows and only return rows from the same tenant as the authenticated user, e.g: ```csharp public class QueryRockstarAudit : QueryDbTenant<RockstarAuditTenant, Rockstar> { public int? Id { get; set; } } ``` To coincide with AutoCRUD there's also support for [declarative validation](https://github.com/ServiceStack/ServiceStack/blob/master/tests/ServiceStack.WebHost.Endpoints.Tests/AutoQueryCrudTests.Validate.cs) which, thanks to [#Script](https://sharpscript.net/), lets you define your Fluent Validation Rules by annotating your Request DTO properties. As it's essentially a different way to define Fluent Validation Rules, it still [needs Validation enabled](/validation#validation-feature). ### AutoMap and AutoDefault Attributes The `[AutoDefault]` attribute allows you to specify default values that the Data Model should be populated with, using the same `#Script` expression support available in `[AutoPopulate]` to populate constant values, cached constant expressions or results of fully evaluated expressions. The `[AutoMap]` attribute gives you the flexibility to maintain external property names that differ from their internal data models, while still being able to declaratively map them. 
Here's an example `ICreateDb<T>` AutoCrud Service that makes use of both these attributes to achieve its desired behavior: ```csharp public class CreateRockstarAutoMapDefault : ICreateDb<Rockstar>, IReturn<RockstarWithIdResponse> { [AutoMap(nameof(Rockstar.FirstName))] public string MapFirstName { get; set; } [AutoMap(nameof(Rockstar.LastName))] public string MapLastName { get; set; } [AutoMap(nameof(Rockstar.Age))] [AutoDefault(Value = 21)] public int? MapAge { get; set; } [AutoMap(nameof(Rockstar.DateOfBirth))] [AutoDefault(Expression = "date(2001,1,1)")] public DateTime MapDateOfBirth { get; set; } [AutoMap(nameof(Rockstar.DateDied))] [AutoDefault(Eval = "utcNow")] public DateTime? MapDateDied { get; set; } [AutoMap(nameof(Rockstar.LivingStatus))] [AutoDefault(Value = LivingStatus.Dead)] public LivingStatus? MapLivingStatus { get; set; } } ``` ### AutoIgnore Attributes To send additional properties with your AutoQuery CRUD Request DTO which don't match the data model you can ignore the validation check by annotating properties with the `[AutoIgnore]` Attribute, e.g: ```csharp public class CustomRockstarService : ICreateDb<Rockstar>, IReturn<RockstarWithIdResponse> { public int Id { get; set; } public int? Age { get; set; } [AutoIgnore] public CustomInfo CustomInfo { get; set; } } ``` Or you can ignore validation for all properties with the same name by registering it to `AutoQuery.IgnoreCrudProperties`, e.g: ```csharp AutoQuery.IgnoreCrudProperties.Add(nameof(CustomInfo)); ``` ### Apply Generic CRUD Behaviors The AutoQuery Attributes are used to construct a metadata model of each operation used to enlist the desired functionality that each Service should have. 
This metadata model can also be programmatically constructed allowing you to codify conventions by grouping annotated attributes under a single `[AutoApply]` attribute resulting in the same behavior had the AutoQuery Request been annotated with the attributes directly, e.g: ```csharp [AutoApply(Behavior.AuditQuery)] public class QueryBookings { ... } // Equivalent to: [AutoFilter(QueryTerm.Ensure, nameof(AuditBase.DeletedDate), Template = SqlTemplate.IsNull)] public class QueryBookings { ... } [AutoApply(Behavior.AuditCreate)] public class CreateBooking { ... } // Equivalent to: [AutoPopulate(nameof(AuditBase.CreatedDate), Eval = "utcNow")] [AutoPopulate(nameof(AuditBase.CreatedBy), Eval = "userAuthName")] [AutoPopulate(nameof(AuditBase.ModifiedDate), Eval = "utcNow")] [AutoPopulate(nameof(AuditBase.ModifiedBy), Eval = "userAuthName")] public class CreateBooking { ... } ``` The `[AutoApply]` attribute is itself an inert marker for capturing what generic behavior you want applied to AutoQuery Services. 
All built-in behavior is declared on the `Behavior` static class: ```csharp public static class Behavior { // Auto Filter SoftDeleted Results public const string AuditQuery = nameof(AuditQuery); // Auto Populate CreatedDate, CreatedBy, ModifiedDate & ModifiedBy fields public const string AuditCreate = nameof(AuditCreate); // Auto Populate ModifiedDate & ModifiedBy fields public const string AuditModify = nameof(AuditModify); // Auto Populate DeletedDate & DeletedBy fields public const string AuditDelete = nameof(AuditDelete); // Auto Populate DeletedDate & DeletedBy fields and changes IDeleteDb operation to Update public const string AuditSoftDelete = nameof(AuditSoftDelete); } ``` #### AuditAutoCrudMetadataFilter This functionality is implemented by extending the metadata for AutoQuery CRUD Services with additional attributes in `AutoQueryFeature.AutoCrudMetadataFilters` delegate filters where they result in the same behavior as if the Request DTOs were annotated with attributes directly. E.g. 
Here's the built-in filter for implementing the above behaviors: ```csharp public static void AuditAutoCrudMetadataFilter(AutoCrudMetadata meta) { foreach (var applyAttr in meta.AutoApplyAttrs) { switch (applyAttr.Name) { case Behavior.AuditQuery: meta.Add(new AutoFilterAttribute( QueryTerm.Ensure, nameof(AuditBase.DeletedDate), SqlTemplate.IsNull)); break; case Behavior.AuditCreate: case Behavior.AuditModify: if (applyAttr.Name == Behavior.AuditCreate) { meta.Add(new AutoPopulateAttribute(nameof(AuditBase.CreatedDate)) { Eval = "utcNow" }); meta.Add(new AutoPopulateAttribute(nameof(AuditBase.CreatedBy)) { Eval = "userAuthName" }); } meta.Add(new AutoPopulateAttribute(nameof(AuditBase.ModifiedDate)) { Eval = "utcNow" }); meta.Add(new AutoPopulateAttribute(nameof(AuditBase.ModifiedBy)) { Eval = "userAuthName" }); break; case Behavior.AuditDelete: case Behavior.AuditSoftDelete: if (applyAttr.Name == Behavior.AuditSoftDelete) meta.SoftDelete = true; meta.Add(new AutoPopulateAttribute(nameof(AuditBase.DeletedDate)) { Eval = "utcNow" }); meta.Add(new AutoPopulateAttribute(nameof(AuditBase.DeletedBy)) { Eval = "userAuthName" }); break; } } } ``` You can use this same mechanism to describe your own custom generic behavior, e.g. let's say you wanted to instead populate your base class with Audit Info containing different named properties with **local** `DateTime` and UserAuth `Id`. You can define your own Behavior name for this functionality: ```csharp [AutoApply("MyUpdate")] public class UpdateBooking { ... 
}
```

and implement it with a custom `AutoCrudMetadataFilters` filter that populates the `[AutoPopulate]` audit attributes on all Request DTOs marked with your Behavior name, e.g:

```csharp
void MyAuditFilter(AutoCrudMetadata meta)
{
    if (meta.HasAutoApply("MyUpdate"))
    {
        meta.Add(new AutoPopulateAttribute(nameof(MyBase.MyModifiedDate)) {
            Eval = "now"
        });
        meta.Add(new AutoPopulateAttribute(nameof(MyBase.MyModifiedBy)) {
            Eval = "userAuthId"
        });
    }
}

services.AddPlugin(new AutoQueryFeature {
    AutoCrudMetadataFilters = { MyAuditFilter },
});
```

### AutoQuery CRUD Events

AutoQuery includes `OnBefore*` and `OnAfter*` (sync & async) events for `Create`, `Update`, `Patch` & `Delete` that can be used to execute custom logic before or after each AutoQuery CRUD operation.

E.g. if your system implements its own Audit history via RDBMS triggers, you can use the `OnBefore` **Delete** event to update the record with deleted info before the AutoQuery CRUD operation deletes it:

```csharp
services.AddPlugin(new AutoQueryFeature {
    OnBeforeDeleteAsync = async ctx => {
        if (ctx.Dto is DeleteBooking deleteBooking)
        {
            var session = await ctx.Request.GetSessionAsync();
            await ctx.Db.UpdateOnlyAsync(() => new Booking {
                DeletedBy = session.UserAuthName,
                DeletedDate = DateTime.UtcNow,
            }, where: x => x.Id == deleteBooking.Id);
        }
    },
});
```

::: info
AutoQuery generates **async** Services by default which will invoke the `*Async` events, but if you implement a [sync Custom AutoQuery CRUD Service](/autoquery/crud#custom-autoquery-crud-services) it executes the **sync** events, so you'd need to implement the `OnBeforeDelete` custom hook instead
:::

### Custom Complex Mapping

Another opportunity to apply more complex custom mapping logic, before resorting to creating an actual Service implementation, is to make use of ServiceStack's built-in [Auto Mapping Populator API](/auto-mapping#intercept-automapping-conversions) to intercept an AutoMapping conversion between 2 types and apply custom logic after the `ConvertTo<T>` or `PopulateWith<T>` APIs, e.g:

```csharp
AutoMapping.RegisterPopulator((Dictionary<string,object> target, CreateRockstar source) => {
    if (!IsAlive(source))
    {
        target[nameof(source.LivingStatus)] = LivingStatus.Dead;
    }
});
```

### Auto Guids

In addition to supporting `[AutoIncrement]` to insert records with Auto Incrementing Ids, you can use `[AutoId]` to insert entities with [RDBMS generated UUIDs](/ormlite/reference-support#auto-populated-guid-ids) where they're supported, otherwise OrmLite populates them with `Guid.NewGuid()`.

::: info
Usage of inheritance isn't required & has the same behavior as using explicit properties
:::

```csharp
public abstract class RockstarBase
{
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public int?
        Age { get; set; }
    public DateTime DateOfBirth { get; set; }
}

public class Rockstar : RockstarBase
{
    [AutoId]
    public Guid Id { get; set; }
}

public class CreateRockstarWithAutoGuid : RockstarBase,
    ICreateDb<Rockstar>, IReturn<RockstarWithIdResponse> { }
```

Or if you prefer Ids to always be populated with `Guid.NewGuid()`, remove `[AutoId]` and populate it with `[AutoPopulate]` instead:

```csharp
[AutoPopulate(nameof(Rockstar.Id), Eval = "nguid")]
public class CreateRockstarWithAutoGuid : RockstarBase,
    ICreateDb<Rockstar>, IReturn<RockstarWithIdResponse> { }
```

### Optimistic Concurrency

We can declaratively add support for [OrmLite's Optimistic Concurrency](/ormlite/optimistic-concurrency) by including a `ulong RowVersion` property on Auto Crud Request/Response DTOs and Data Models, e.g:

```csharp
// Data Model
public class RockstarVersion : RockstarBase
{
    [AutoIncrement]
    public int Id { get; set; }
    public ulong RowVersion { get; set; }
}

public class CreateRockstarVersion : RockstarBase,
    ICreateDb<RockstarVersion>, IReturn<RockstarWithIdAndRowVersionResponse> { }

public class UpdateRockstarVersion : RockstarBase,
    IPatchDb<RockstarVersion>, IReturn<RockstarWithIdAndRowVersionResponse>
{
    public int Id { get; set; }
    public ulong RowVersion { get; set; }
}

// Response DTO
public class RockstarWithIdAndRowVersionResponse
{
    public int Id { get; set; }
    public ulong RowVersion { get; set; }
    public ResponseStatus ResponseStatus { get; set; }
}
```

AutoQuery populates `RowVersion` in Response DTOs, which then needs to be provided whenever making changes to that entity. Updates will fail if no `RowVersion` was provided or if the entity has since been modified:

```csharp
var createResponse = client.Post(new CreateRockstarVersion {
    FirstName = "Original",
    LastName = "Version",
    Age = 20,
    DateOfBirth = new DateTime(2001,7,1),
    LivingStatus = LivingStatus.Dead,
});

// throws OptimisticConcurrencyException: No RowVersion provided
client.Patch(new
    UpdateRockstarVersion {
    Id = createResponse.Id,
    LastName = "UpdatedVersion",
});

// succeeds if "Original Version" wasn't modified, otherwise throws OptimisticConcurrencyException
var response = client.Patch(new UpdateRockstarVersion {
    Id = createResponse.Id,
    LastName = "UpdatedVersion",
    RowVersion = createResponse.RowVersion,
});
```

### MQ AutoQuery CRUD Requests

As AutoQuery CRUD Services are just ServiceStack Services, they can partake in its ecosystem of features like being able to [invoke Services via MQ](/messaging), although some extra consideration is needed to account for the differences between HTTP and MQ Requests.

First, any filters you've added to populate `IRequest.Items`, like a tenant Id, also need to be registered in `GlobalMessageRequestFilters` so they're executed for MQ Requests as well:

```csharp
GlobalRequestFilters.Add(SetTenant);        // HTTP Requests
GlobalMessageRequestFilters.Add(SetTenant); // MQ Requests
```

Secondly, Auth information is typically sent in HTTP Request Headers, but it needs to be included in the Request DTO to send Authenticated MQ Requests, which can either implement `IHasSessionId` for normal [Session Auth Providers](/auth/authentication-and-authorization#session-authentication-overview), e.g:

```csharp
public class CreateRockstarAuditTenant
    : CreateAuditTenantBase<RockstarAuditTenant, RockstarWithIdAndResultResponse>, IHasSessionId
{
    public string SessionId { get; set; } //Authenticate MQ Requests
    //...
}
```

Alternatively they can implement `IHasBearerToken` for [stateless Bearer Token](/auth/authentication-and-authorization#authentication-per-request-auth-providers) Auth providers like JWT or API Keys.
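For illustration, a `SetTenant` filter like the one registered above could be as simple as copying a request value into `IRequest.Items`. This is a hypothetical sketch, the `X-Tenant-Id` header name and lookup logic are assumptions, not part of ServiceStack:

```csharp
// Hypothetical filter shared by both HTTP & MQ Requests:
// resolves the tenant & shares it with downstream filters and Services
void SetTenant(IRequest req, IResponse res, object requestDto)
{
    var tenantId = req.GetHeader("X-Tenant-Id"); // assumption: tenant sent in a custom header
    if (tenantId != null)
        req.Items["TenantId"] = tenantId;
}
```

Because the same delegate is registered in both `GlobalRequestFilters` and `GlobalMessageRequestFilters`, downstream code can read `IRequest.Items["TenantId"]` without caring whether the request arrived over HTTP or MQ.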
If you're publishing an MQ Request inside a HTTP Service you can use the `PopulateRequestDtoIfAuthenticated` extension method which populates the Request DTO from the Authenticated HTTP Request, e.g:

```csharp
public class AutoCrudMqServices : Service
{
    public void Any(CreateRockstarAuditTenantMq request)
    {
        var mqRequest = request.ConvertTo<CreateRockstarAuditTenant>();
        Request.PopulateRequestDtoIfAuthenticated(mqRequest);
        PublishMessage(mqRequest);
    }
}
```

In this case if using [Background MQ](/background-mq), it will execute the `CreateRockstarAuditTenant` request in a background thread, populating the MQ Request Context with the session identified by `IRequest.GetSessionId()`.

#### Publishing Requests to OneWay Endpoint

You can also send MQ requests directly by [publishing to the OneWay HTTP endpoint](/messaging#oneway-http-requests-are-published-to-mq-then-executed). If your AppHost is registered with an MQ Server, this publishes the message to the MQ and auto-populates Request DTOs that implement `IHasSessionId` or `IHasBearerToken`, either implicitly when sent from an Authenticated client:

```csharp
var authResponse = authClient.Post(new Authenticate {
    provider = "credentials",
    UserName = "admin@email.com",
    Password = "p@55wOrd",
    RememberMe = true,
});

authClient.SendOneWay(new CreateRockstarAuditTenant {
    FirstName = nameof(CreateRockstarAuditTenant),
    LastName = "SessionId",
    Age = 20,
    DateOfBirth = new DateTime(2002,2,2),
});
```

Or from an anonymous client with the explicit `BearerToken` or `SessionId` properties populated, e.g:

```csharp
client.SendOneWay(new CreateRockstarAuditMqToken {
    BearerToken = JwtUserToken,
    FirstName = nameof(CreateRockstarAuditMqToken),
    LastName = "JWT",
    Age = 20,
    DateOfBirth = new DateTime(2002,2,2),
});
```

To save populating the `BearerToken` in each request, you can set it once on the Service Client which will automatically populate it on Request DTOs:

```csharp
client.BearerToken = jwtUserToken;
```

## AutoQuery CRUD Features

Building upon AutoQuery are a number of other features that increase the capabilities of AutoQuery Services & provide instant utility, including:

- [Declarative Validation](/declarative-validation)
- [Executable Audit Log](/autoquery/audit-log)
- [Instantly Servicify with AutoGen](/autoquery/autogen)

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/csharp-add-servicestack-reference.md

---
slug: csharp-add-servicestack-reference
title: C# Add ServiceStack Reference
---

[![](https://raw.githubusercontent.com/ServiceStackApps/HelloMobile/master/screenshots/splash-900.png)](https://github.com/ServiceStackApps/HelloMobile)

The primary and most popular [Add ServiceStack Reference](/add-servicestack-reference) language supported is C#, providing a flexible alternative to sharing your DTO assembly with clients, who can now easily add a reference to a remote ServiceStack instance and update DTOs directly from within VS.NET.

This also lays the groundwork and signals our approach for adding support for typed APIs in other languages in the future. Add a [feature request for your favorite language](https://servicestack.net/ideas) to prioritize support for it sooner!

Our goal with Native Types is to provide an alternative to sharing DTO dlls that enables a better dev workflow for external clients, who are now able to generate (and update) Typed APIs for your Services from a remote url, reducing the burden and effort required to consume ServiceStack Services whilst benefiting from their native language's strong-typing feedback.
### C# Xamarin.Android Example in VS.NET

<lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="cbYuem1b2tg" style="background-image: url('https://img.youtube.com/vi/cbYuem1b2tg/maxresdefault.jpg')"></lite-youtube>

## Add ServiceStack Reference

The easiest way to [Add a ServiceStack reference](/add-servicestack-reference) to your project is to right-click on your project to bring up [ServiceStackVS's](/templates/install-servicestackvs) `Add ServiceStack Reference` context-menu item. This opens a dialog where you can add the url of the ServiceStack instance you want typed DTOs for, as well as the name of the DTO source file that's added to your project.

[![Add ServiceStack Reference](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/apps/StackApis/add-service-ref-flow.png)](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/apps/StackApis/add-service-ref-flow.png)

After clicking OK, the server's DTOs and the [ServiceStack.Client](https://www.nuget.org/packages/ServiceStack.Client) NuGet package are added to the project, providing an instant typed API:

[![Calling ServiceStack Service](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/apps/StackApis/call-service.png)](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/apps/StackApis/call-service.png)

With the C# code generated on the Server, the role of [ServiceStackVS's](/create-your-first-webservice) **Add ServiceStack Reference** is then just to integrate the remote C# DTOs into the client's VS.NET project. This is just getting the generated DTOs from the server with default options set by the server and adding them locally to your project within Visual Studio.
![Add CSharp ServiceStack Reference Demo](https://github.com/ServiceStack/Assets/raw/master/img/servicestackvs/servicestack%20reference/addref-csharp.gif)

## Update ServiceStack Reference

If your server has been updated and you want to update the client DTOs, simply right-click on the DTO file within VS.NET and select `Update ServiceStack Reference`.

![CSharp update demo](https://github.com/ServiceStack/Assets/raw/master/img/servicestackvs/servicestack%20reference/updateref-csharp.gif)

## Consuming Services from Mobile Clients

Thanks to [ServiceStack.Client](https://www.nuget.org/packages/ServiceStack.Client) PCL Support, it can also be used from within supported client platforms. Here's a quick Android demo of adding a ServiceStack reference to [blazor-vue.web-templates.io](https://blazor-vue.web-templates.io) and consuming one of StackApi's Services:

[![Android Add ServiceStack Reference](https://raw.githubusercontent.com/ServiceStack/ServiceStackVS/master/Images/android-add-ref-demo.gif)](https://raw.githubusercontent.com/ServiceStack/ServiceStackVS/master/Images/android-add-ref-demo.gif)

## DTO Customization Options

The header comments in the generated DTOs allow for further customization of how they're generated, where ServiceStackVS automatically watches for any file changes and updates the generated DTOs with any custom Options provided.
Options that are preceded by a C# single line comment `//` are defaults from the server that can be overridden, e.g:

```csharp
/* Options:
Date: 2025-06-04 09:45:54
Version: 8.80
Tip: To override a DTO option, remove "//" prefix before updating
BaseUrl: https://blazor-vue.web-templates.io

//GlobalNamespace:
//MakePartial: True
//MakeVirtual: True
//MakeInternal: False
//MakeDataContractsExtensible: False
//AddNullableAnnotations: False
//AddReturnMarker: True
//AddDescriptionAsComments: True
//AddDataContractAttributes: False
//AddIndexesToDataMembers: False
//AddGeneratedCodeAttributes: False
//AddResponseStatus: False
//AddImplicitVersion:
//InitializeCollections: False
//ExportValueTypes: False
//IncludeTypes:
//ExcludeTypes:
//AddNamespaces:
//AddDefaultXmlNamespace: http://schemas.servicestack.net/types
*/
```

To override these options on the client, the `//` has to be removed. For example, if we did not want our classes to be partial by default for the C# client, our options would look like below:

```csharp
/* Options:
Date: 2025-06-04 09:46:15
Version: 8.80
Tip: To override a DTO option, remove "//" prefix before updating
BaseUrl: https://blazor-vue.web-templates.io

//GlobalNamespace:
MakePartial: False
//MakeVirtual: True
//MakeInternal: False
//MakeDataContractsExtensible: False
//AddNullableAnnotations: False
//AddReturnMarker: True
//AddDescriptionAsComments: True
//AddDataContractAttributes: False
//AddIndexesToDataMembers: False
//AddGeneratedCodeAttributes: False
//AddResponseStatus: False
//AddImplicitVersion:
//InitializeCollections: False
//ExportValueTypes: False
//IncludeTypes:
//ExcludeTypes:
//AddNamespaces:
//AddDefaultXmlNamespace: http://schemas.servicestack.net/types
*/
```

Options that do not start with a `//` are sent to the server to override any defaults set by the server.
### Change Default Server Configuration

The above defaults are also overridable on the ServiceStack Server by modifying the default config on the `NativeTypesFeature` Plugin, e.g:

```csharp
var nativeTypes = this.GetPlugin<NativeTypesFeature>();
nativeTypes.MetadataTypesConfig.MakeVirtual = false;
...
```

### Customize DTO Type generation

Additional C#-specific customization can be statically configured: the `PreTypeFilter`, `InnerTypeFilter` & `PostTypeFilter` filters (available in all languages) can be used to inject custom code into the generated DTOs output.

Use the `PreTypeFilter` to generate source code before and after a Type definition, e.g. this will append the `[Validate]` attribute on non enum & interface types:

```csharp
CSharpGenerator.PreTypeFilter = (sb, type) => {
    if (!type.IsEnum.GetValueOrDefault() && !type.IsInterface.GetValueOrDefault())
    {
        sb.AppendLine("[Validate]");
    }
};
```

The `InnerTypeFilter` gets invoked just after the Type Definition, which can be used to generate common members for all Types and interfaces, e.g:

```csharp
CSharpGenerator.InnerTypeFilter = (sb, type) => {
    sb.AppendLine("public string Id { get; } = Guid.NewGuid().ToString();");
};
```

There's also `PrePropertyFilter` & `PostPropertyFilter` for generating source before and after properties, e.g:

```csharp
CSharpGenerator.PrePropertyFilter = (sb, prop, type) => {
    if (prop.Name == "Id")
    {
        sb.AppendLine("[PrimaryKey]");
    }
};
```

### Emit custom code

To enable greater flexibility when generating complex Typed DTOs, you can use `[Emit{Language}]` attributes to generate code before each type or property.
These attributes can be used to generate different attributes or annotations to enable client validation for different validation libraries in different languages, e.g:

```csharp
[EmitCSharp("[Validate]")]
[EmitCode(Lang.CSharp | Lang.Swift | Lang.Dart, "// App User")]
public class User : IReturn<User>
{
    [EmitCSharp("[IsNotEmpty]","[IsEmail]")]
    [EmitCode(Lang.Swift | Lang.Dart, new[]{ "@isNotEmpty()", "@isEmail()" })]
    public string Email { get; set; }
}
```

Which will generate `[EmitCSharp]` code in C# DTOs:

```csharp
[Validate]
// App User
public partial class User : IReturn<User>
{
    [IsNotEmpty]
    [IsEmail]
    public virtual string Email { get; set; }
}
```

Whilst the generic `[EmitCode]` attribute lets you emit the same code in multiple languages with the same syntax.

We'll go through and cover each of the above options to see how they affect the generated DTOs:

### GlobalNamespace

Specify which namespace the generated C# DTOs should use:

```csharp
namespace Acme
{
    //...
}
```

### MakePartial

Adds the `partial` modifier to all types, letting you extend generated DTOs with your own class separate from the generated types:

```csharp
public partial class GetAnswers { ... }
```

### MakeVirtual

Adds the `virtual` modifier to all properties:

```csharp
public partial class GetAnswers
{
    ...
    public virtual int QuestionId { get; set; }
}
```

### MakeDataContractsExtensible

Add .NET's DataContract's [ExtensionDataObject](http://msdn.microsoft.com/en-us/library/system.runtime.serialization.extensiondataobject(v=vs.110).aspx) to all DTOs:

```csharp
public partial class GetAnswers
    : IReturn<GetAnswerResponse>, IExtensibleDataObject
{
    ...
    public virtual ExtensionDataObject ExtensionData { get; set; }
}
```

### AddNullableAnnotations

Generate DTOs with nullable reference types, e.g:

```csharp
public class Data
{
    public int Value { get; set; }
    public int? OptionalValue { get; set; }
    public string Text { get; set; }
    public string?
        OptionalText { get; set; }
    public List<string> Texts { get; set; }
    public List<string>? OptionalTexts { get; set; }
}
```

Will generate DTOs preserving properties with nullable reference type annotations:

```csharp
public class Data
{
    public virtual int Value { get; set; }
    public virtual int? OptionalValue { get; set; }
    public virtual string Text { get; set; }
    public virtual string? OptionalText { get; set; }
    public virtual List<string> Texts { get; set; } = [];
    public virtual List<string>? OptionalTexts { get; set; }
}
```

Optionally, if your DTOs do not have nullable reference annotations enabled but you would still like to generate DTOs with them included, you can mark properties as required with the `[Required]` attribute, e.g:

```csharp
public class Data
{
    [Required]
    public string? Text { get; set; }
    [Required]
    public List<string>? Texts { get; set; }
}
```

Where it will generate otherwise optional properties as non-nullable reference types:

```csharp
public class Data
{
    [Required]
    public virtual string Text { get; set; }
    [Required]
    public virtual List<string> Texts { get; set; } = [];
}
```

### AddReturnMarker

When true, annotates Request DTOs with an `IReturn<TResponse>` marker referencing the Response type ServiceStack infers your Service to return:

```csharp
public class GetAnswers
    : IReturn<GetAnswersResponse> { ... }
```

> Original DTO doesn't require a return marker as the response type can be inferred from the Service's return type or when using the `%Response` DTO naming convention

### AddDescriptionAsComments

Converts any textual Description in `[Description]` attributes as C# Doc comments which allows your API to add intellisense in client projects:

```csharp
///<summary>
///Get a list of Answers for a Question
///</summary>
public class GetAnswers { ...
}
```

### AddDataContractAttributes

Decorates all DTO types with `[DataContract]` and properties with `[DataMember]`, as well as adding default XML namespaces for all C# namespaces used:

```csharp
[assembly: ContractNamespace("http://schemas.servicestack.net/types",
           ClrNamespace="StackApis.ServiceModel.Types")]
[assembly: ContractNamespace("http://schemas.servicestack.net/types",
           ClrNamespace="StackApis.ServiceModel")]

...

[DataContract]
public partial class GetAnswers
{
    [DataMember]
    public virtual int QuestionId { get; set; }
}
```

### AddIndexesToDataMembers

Populates a `DataMember` Order index for all properties:

```csharp
[DataContract]
public partial class GetAnswers
{
    [DataMember(Order=1)]
    public virtual int QuestionId { get; set; }
}
```

> Requires AddDataContractAttributes=true

### AddGeneratedCodeAttributes

Emit `[GeneratedCode]` attribute on all generated Types:

```csharp
[GeneratedCode]
public partial class GetAnswers { ... }
```

### AddResponseStatus

Automatically add a `ResponseStatus` property on all Response DTOs, even if it wasn't already defined:

```csharp
public class GetAnswersResponse
{
    ...
    public ResponseStatus ResponseStatus { get; set; }
}
```

### AddImplicitVersion

Usage:

```
/* Options:
AddImplicitVersion: 1
```

Lets you specify the Version number to be automatically populated in all Request DTOs sent from the client:

```csharp
public partial class GetAnswers
    : IReturn<GetAnswersResponse>
{
    public virtual int Version { get; set; }

    public GetAnswers()
    {
        Version = 1;
    }
    ...
}
```

This lets you know what Version of the Service Contract existing clients are using, making it easy to implement ServiceStack's [recommended versioning strategy](http://stackoverflow.com/a/12413091/85785).
### InitializeCollections

Usage:

```
/* Options:
InitializeCollections: True
```

Lets you automatically initialize collections in Request DTOs:

```csharp
public class SearchQuestions
{
    public SearchQuestions()
    {
        Tags = new List<string>{};
    }
    public List<string> Tags { get; set; }
    ...
}
```

Initialized collections let you take advantage of C#'s collection initializers for a nicer client API:

```csharp
var response = client.Get(new SearchQuestions {
    Tags = { "redis", "ormlite" }
});
```

### IncludeTypes

Is used as a Whitelist to specify only the types you would like to have code-generated:

```
/* Options:
IncludeTypes: GetTechnology,GetTechnologyResponse
```

Will only generate `GetTechnology` and `GetTechnologyResponse` DTOs:

```csharp
public class GetTechnology { ... }
public class GetTechnologyResponse { ... }
```

#### Include Generic Types

Use .NET's Type Name to include Generic Types, i.e. the Type name separated by the backtick followed by the number of generic arguments, e.g:

```
IncludeTypes: IReturn`1,MyPair`2
```

#### Include Request DTO and its dependent types

You can include a Request DTO and all its dependent types with a `.*` suffix on the Request DTO, e.g:

```
/* Options:
IncludeTypes: GetTechnology.*
```

Which will include the `GetTechnology` Request DTO, the `GetTechnologyResponse` Response DTO and all Types that they both reference.

#### Include All Types within a C# namespace

If your DTOs are grouped into different namespaces they can be all included using the `/*` suffix, e.g:

```
/* Options:
IncludeTypes: MyApp.ServiceModel.Admin/*
```

This will include all DTOs within the `MyApp.ServiceModel.Admin` C# namespace.
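As an aside, the backtick convention used for generic types in `IncludeTypes` above is .NET's own reflection name for open generic types (type name + `` ` `` + number of generic arguments), which you can verify with standard reflection:

```csharp
// .NET names an open generic type as: TypeName`NumberOfGenericArgs
Console.WriteLine(typeof(IReturn<>).Name);     // IReturn`1
Console.WriteLine(typeof(Dictionary<,>).Name); // Dictionary`2
```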
#### Include All Services in a Tag Group

Services [grouped by Tag](/api-design#group-services-by-tag) can be used in the `IncludeTypes` where tags can be specified using braces in the format `{tag}` or `{tag1,tag2,tag3}`, e.g:

```
/* Options:
IncludeTypes: {web,mobile}
```

Or individually:

```
/* Options:
IncludeTypes: {web},{mobile}
```

### ExcludeTypes

Is used as a Blacklist to specify which types you would like excluded from being generated:

```
/* Options:
ExcludeTypes: GetTechnology,GetTechnologyResponse
```

Will exclude `GetTechnology` and `GetTechnologyResponse` DTOs from being generated.

### AddNamespaces

Include additional C# namespaces, e.g:

```
/* Options:
AddNamespaces: System.Drawing,MyApp
```

Where it will generate the specified namespaces in the generated Types:

```csharp
using System.Drawing;
using MyApp;
```

### AddDefaultXmlNamespace

This lets you change the default DataContract XML namespace used for all C# namespaces:

```csharp
[assembly: ContractNamespace("http://my.types.net",
           ClrNamespace="StackApis.ServiceModel.Types")]
[assembly: ContractNamespace("http://my.types.net",
           ClrNamespace="StackApis.ServiceModel")]
```

> Requires AddDataContractAttributes=true

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/csharp-client.md

---
slug: csharp-client
title: C#/.NET Service Clients
---

Using DTOs to define your web service interface makes it possible to provide strong-typed generic service clients without any code-gen or extra build-steps, leading to a productive end-to-end type-safe communication gateway from client to server.
<lite-youtube class="w-full mx-4 my-4" width="560" height="315" videoid="cbYuem1b2tg" style="background-image: url('https://img.youtube.com/vi/cbYuem1b2tg/maxresdefault.jpg')"></lite-youtube>

**ServiceStack.Client** is the primary NuGet package containing ServiceStack's client libraries that can be included in your `.csproj` with:

:::copy
`<PackageReference Include="ServiceStack.Client" Version="10.*" />`
:::

Versions earlier than **.NET 6.0** can use the [HttpClient-based JsonHttpClient](/csharp-client#jsonhttpclient) in:

:::copy
`<PackageReference Include="ServiceStack.HttpClient" Version="10.*" />`
:::

### JsonApiClient

From **.NET 6+** it's recommended to use the newest [JsonApiClient](/releases/v6#jsonapiclient) released in v6+:

```csharp
var client = new JsonApiClient(baseUri);
```

### HttpClient Factory Registration

In client Apps that support it, the recommendation is to use a HttpClient Factory, which can be used to register the `JsonApiClient` dependency in your App with:

```csharp
builder.Services.AddJsonApiClient(builder.Configuration["BaseUrl"]);
```

It's now recommended to use `JsonApiClient` when it's available, but for simplification the docs will continue to reference the substitutable & more broadly available `JsonServiceClient`.

#### Blazor Client Registration

**Blazor WASM** should instead use the tailored `AddBlazorApiClient()` which also configures a CORS-enabled typed `JsonApiClient`:

```csharp
builder.Services.AddBlazorApiClient(builder.Configuration["ApiBaseUrl"] ?? builder.HostEnvironment.BaseAddress);
```

### Setup

All ServiceStack's C# clients share the same interfaces and are created by passing the **Base URI** of your ServiceStack service into the client's constructor, e.g.
if your ServiceStack instance was hosted on the root path `/` on the **5001** custom port:

```csharp
var client = new JsonApiClient("https://host:5001");
```

Or if hosted on the `/custom` custom path:

```csharp
var client = new JsonApiClient("https://host/custom/");
```

### Recommended ServiceClient for .NET 6+

Going forward we'll continue improving `JsonApiClient` with new .NET runtime features and optimizations as they become available, and now that .NET's `HttpWebRequest` has been officially **deprecated in .NET 6+** we recommend switching to `JsonApiClient` in .NET 6+ runtimes.

### Safe Sync HttpClient APIs

Until the **net6.0** TFM builds were added there was no officially supported way to perform synchronous requests with `HttpClient`, so to implement the complete `IServiceClient` interface, `JsonHttpClient` had to adopt the least problematic sync-over-async solution. `JsonApiClient` improves its synchronous support by rewriting all sync methods to use HttpClient's new blocking `Send()` method. Whilst blocking I/O continues to impact scalability, it's nice to finally have an officially supported method that's safe to use, free from deadlock concerns.

## High level `Api` and `ApiAsync` methods

.NET was originally conceived to use Exceptions for error control flow, however there's been a tendency in modern languages & libraries to shun Exceptions and return errors as normal values, an approach we believe is a more flexible & ergonomic way to handle API responses.

### The ApiResult way

The new APIs simply return a typed `ApiResult<Response>` Value Result that encapsulates either a Typed Response or a structured API Error populated in `ResponseStatus`, allowing you to handle API responses programmatically without `try/catch` handling:

```csharp
var api = client.Api(new Hello { Name = name });
if (api.Failed)
    Console.WriteLine($"Greeting failed! {api.Error.ErrorMessage}");
else
    Console.WriteLine($"API Says: {api.Response.Result}"); //api.Succeeded
```

### C# Example

A preview of what this looks like is visible in [Blazor WASM's Dev Model Preview](/templates/blazor-bootstrap#api-and-apiasync-methods) example code to create a new Booking:

```csharp
CreateBooking request = new();
ApiResult<IdResponse> api = new();

async Task OnSubmit()
{
    api = await Client.ApiAsync(request);
    if (api.Succeeded)
    {
        await done.InvokeAsync(api.Response!);
        request = new();
    }
}
```

Which despite its terseness handles both **success** and **error** API responses: **if successful** it invokes the `done()` callback notifying its parent of the new Booking API Response before resetting the Form's data model with a new Request DTO. Upon **failure** the error response is populated in `api.Error`, which binds to the UI via Blazor's `<CascadingValue Value=@api.Error>` to propagate it to all its child components in order to show contextual validation errors next to their respective Input controls.

### Available in all .NET and TypeScript Clients

The new `Api` and `ApiAsync` methods are available in all .NET Service Clients, including [Service Gateway's](/service-gateway).

## REST API

In addition, the Service Clients provide HTTP verbs (Get, Post & PostFile, Put, Delete, Patch, etc) enabling a productive typed API for consuming ServiceStack Services with their best matching Custom Routes as seen in the examples below:

> See [IServiceClient](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/IServiceClient.cs) for the full API available

### Using the recommended [API Design](/api-design)

```csharp
HelloResponse response = client.Get(new Hello { Name = "World!" });
response.Result.Print();
```

#### Async Example

Using C# `await`:

```csharp
HelloResponse response = await client.GetAsync(
    new Hello { Name = "World!"
    });
```

### Alternative API

```csharp
var response = client.Get<HelloResponse>("/hello/World!");
response.Result.Print();
```

#### Async Example

```csharp
var response = await client.GetAsync<HelloResponse>("/hello/World!");
```

## Service Client API

C#/.NET Clients can call the above Hello Service using any of the JSON, JSV, XML or SOAP Service Clients with the code below:

### Using the recommended [API Design](/api-design)

```csharp
var response = client.Send(new Hello { Name = "World!" });
response.Result.Print();
```

#### Async Example

```csharp
var response = await client.SendAsync(new Hello { Name = "World!" });
response.Result.Print();
```

### Alternative API

```csharp
var response = client.Send<HelloResponse>(new Hello { Name = "World!" });
response.Result.Print();
```

#### Async Example

```csharp
var response = await client.SendAsync<HelloResponse>(
    new Hello { Name = "World!" });
```

The service clients use the automatic [pre-defined routes](/endpoints) for each service.

### File Upload with Request

The `PostFileWithRequest*` methods can be used to upload a file with an API Request.
Here's an example calling [AI Server's](/ai-server/) `SpeechToText` API:

### C# Speech to Text

```csharp
using var fsAudio = File.OpenRead("audio.wav");
var response = client.PostFileWithRequest(new SpeechToText(),
    new UploadFile("audio.wav", fsAudio, "audio"));
```

### Multiple File Uploads

The `PostFilesWithRequest*` methods can be used to upload multiple files with an API Request, e.g:

### C# Watermark Video

```csharp
using var fsVideo = File.OpenRead("video.mp4");
using var fsWatermark = File.OpenRead("watermark.png");
var response = client.PostFilesWithRequest(new QueueWatermarkVideo {
        Position = WatermarkPosition.BottomRight
    }, [
        new UploadFile("video.mp4", fsVideo, "video"),
        new UploadFile("watermark.png", fsWatermark, "watermark")
    ]);
```

<a name="native-responses"></a>

### [Cache Aware Service Clients](/cache-aware-clients)

When [caching is enabled on Services](/http-caching), the Cache-aware Service Clients can dramatically improve performance by eliminating server requests entirely as well as reducing bandwidth for re-validated requests. They also offer an additional layer of resiliency, as re-validated requests that result in Errors will transparently fall back to using pre-existing locally cached responses. For bandwidth-constrained environments like Mobile Apps they can dramatically improve the User Experience.
The Cache-Aware clients implement the full `IServiceClient` interface so they should be an easy drop-in enhancement for existing Apps: ```csharp IServiceClient client = new JsonServiceClient(baseUrl).WithCache(); //equivalent to: IServiceClient client = new CachedServiceClient(new JsonServiceClient(baseUrl)); ``` Likewise for the HttpClient-based `JsonHttpClient`: ```csharp IServiceClient client = new JsonHttpClient(baseUrl).WithCache(); //equivalent to: IServiceClient client = new CachedHttpClient(new JsonHttpClient(baseUrl)); ``` ## Support for Native built-in Response Types All of ServiceStack's generic Service Clients also allow you to fetch raw `string`, `byte[]` and `Stream` responses of any existing service, or when you need it, the underlying `HttpWebResponse` allowing fine-grained access to the HTTP Response. e.g With just the Service below: ```csharp [Route("/poco/{Text}")] public class Poco : IReturn<PocoResponse> { public string Text { get; set; } } public class PocoResponse { public string Result { get; set; } } public class NativeTypesExamples : Service { public PocoResponse Any(Poco request) { base.Response.AddHeader("X-Response", request.Text); return new PocoResponse { Result = "Hello, " + (request.Text ?? 
"World!") }; } } ``` You can access it normally with the typed API: ```csharp PocoResponse response = client.Get(new Poco { Text = "World" }); response.Result //Hello, World ``` Or get the JSON as a raw string: ```csharp string responseJson = client.Get<string>("/poco/World"); var dto = responseJson.FromJson<PocoResponse>(); dto.Result //Hello, World ``` Or as raw bytes: ```csharp byte[] responseBytes = client.Get<byte[]>("/poco/World"); var dto = responseBytes.FromUtf8Bytes().FromJson<PocoResponse>(); dto.Result //Hello, World ``` Or as a Stream: ```csharp using Stream responseStream = client.Get<Stream>("/poco/World"); var dto = responseStream.ReadFully() .FromUtf8Bytes() .FromJson<PocoResponse>(); dto.Result //Hello, World ``` Async download & write to file example: ```csharp using var stream = await client.GetAsync<Stream>(new GetFile { Path = "/path/to/file.png" }); using var fs = File.Create(Path.Combine(uploadsDir, "file.png")); await stream.CopyToAsync(fs); ``` Or even access the populated `HttpWebResponse` object: ```csharp HttpWebResponse webResponse = client.Get<HttpWebResponse>("/poco/World"); webResponse.Headers["X-Response"] //= World using var stream = webResponse.GetResponseStream(); using var sr = new StreamReader(stream); var dto = sr.ReadToEnd().FromJson<PocoResponse>(); dto.Result //Hello, World ``` ### Accessing raw service responses ServiceStack isn't limited to just returning POCO's as you can effectively [return anything you want](/service-return-types) even images [/helloimage/ServiceStack?Width=600&height=300&Foreground=Yellow](https://test.servicestack.net/image-draw/ServiceStack?Width=600&height=300&Foreground=Yellow).
These native responses can also be marked on your Request DTO with the `IReturn<T>` interface marker to give you a terse end-to-end API for fetching raw responses, e.g: ```csharp [Route("/headers/{Text}")] public class Headers : IReturn<HttpWebResponse> { public string Text { get; set; } } [Route("/strings/{Text}")] public class Strings : IReturn<string> { public string Text { get; set; } } [Route("/bytes/{Text}")] public class Bytes : IReturn<byte[]> { public string Text { get; set; } } [Route("/streams/{Text}")] public class Streams : IReturn<Stream> { public string Text { get; set; } } public class BuiltInTypesService : Service { public void Any(Headers request) { base.Response.AddHeader("X-Response", request.Text); } public string Any(Strings request) { return "Hello, " + (request.Text ?? "World!"); } public byte[] Any(Bytes request) { return new Guid(request.Text).ToByteArray(); } public byte[] Any(Streams request) { return new Guid(request.Text).ToByteArray(); } } ``` ### Accessing client raw responses Which lets you access the results as you would a normal response: ```csharp using HttpWebResponse response = client.Get(new Headers { Text = "World" }); response.Headers["X-Response"] // "World" string response = client.Get(new Strings { Text = "World" }); response // Hello, World byte[] response = client.Get(new Bytes { Text = Guid.NewGuid().ToString() }); var guid = new Guid(response); using Stream stream = client.Get(new Streams { Text = Guid.NewGuid().ToString() }); var guid = new Guid(stream.ReadFully()); ``` All these APIs are also available asynchronously: ```csharp using HttpWebResponse response = await client.GetAsync( new Headers { Text = "World" }); response.Headers["X-Response"] // "World" string response = await client.GetAsync( new Strings { Text = "World" }); response // Hello, World byte[] response = await client.GetAsync(new Bytes { Text = Guid.NewGuid().ToString() }); var guid = new Guid(response); using Stream stream = await client.GetAsync(new
Streams { Text = Guid.NewGuid().ToString() }); var guid = new Guid(stream.ReadFully()); ``` ::: warning You must explicitly dispose all APIs returning either `HttpWebResponse` or `Stream` as seen in the above examples. ::: They all behave the same as the sync versions except for `HttpWebResponse` which gets returned just after the request is sent (asynchronously) and before any response is read so you can still access the HTTP Headers e.g: ```csharp var client = new JsonServiceClient("http://localhost:2020/") { ResponseFilter = httpRes => { var header = httpRes.Headers["X-Response"]; } }; var response = await client.GetAsync(new Headers { Text = "World" }); ``` Which makes a great starting point if you want to stream the responses back asynchronously as seen in this [Reactive ServiceStack example](https://gist.github.com/bamboo/5078236) by [@rodrigobamboo](https://twitter.com/rodrigobamboo). More examples can be found in the ServiceClients [Built-in native type response tests](https://github.com/ServiceStack/ServiceStack/blob/master/tests/ServiceStack.WebHost.Endpoints.Tests/ServiceClientsBuiltInResponseTests.cs) ## Sending Raw Data .NET Service Clients can also send raw `string`, `byte[]` or `Stream` Request bodies in their custom Sync or Async API's, e.g: ```csharp string json = "{\"Key\":1}"; client.Post<SendRawResponse>("/sendraw", json); byte[] bytes = json.ToUtf8Bytes(); client.Put<SendRawResponse>("/sendraw", bytes); Stream stream = new MemoryStream(bytes); await client.PostAsync<SendRawResponse>("/sendraw", stream); ``` ### Sending Typed Request with Raw Body The `*Body` and `*BodyAsync` APIs available in all Service Clients let you post a separate Request Body for Request DTOs that implement `IRequiresRequestStream`, where they contain both properties and a custom Request Body, e.g: ```csharp [Route("/json")] public class SendJson : IRequiresRequestStream, IReturn<string> { public string Name { get; set; } public Stream RequestStream { get; set; } }
[Route("/text")] public class SendText : IRequiresRequestStream, IReturn<string> { public string Name { get; set; } public string ContentType { get; set; } public Stream RequestStream { get; set; } } public class SendRawService : Service { [JsonOnly] public object Any(SendJson request) => request.RequestStream.ReadFully(); public object Any(SendText request) { base.Request.ResponseContentType = request.ContentType ?? base.Request.AcceptTypes[0]; return request.RequestStream.ReadFully(); } } ``` The new APIs accept both a Request DTO which specifies which Service to call and what properties to add to the QueryString and another object to send in the raw HTTP Request Body, e.g: ```csharp var client = new JsonServiceClient(BaseUrl); var json = client.PostBody(new SendJson { Name = "JSON body" }, new PocoRequest { Foo = "Bar" }); json.FromJson<PocoRequest>().Foo //= Bar json = await client.PutBodyAsync(new SendJson { Name = "JSON body" }, "{\"Foo\":\"Bar\"}"); json.FromJson<PocoRequest>().Foo //= Bar var client = new JsonHttpClient(BaseUrl); var request = new SendText { Name = "Text body", ContentType = "text/plain" }; var text = await client.PostBodyAsync(request, "foo"); text //= foo ``` ## Client / Server Request Compression You can elect to compress HTTP Requests in any C#/.NET Service Clients by specifying the Compression Type you wish to use, e.g: ```csharp var client = new JsonServiceClient(baseUrl) { RequestCompressionType = CompressionTypes.GZip, }; var client = new JsonHttpClient(baseUrl) { RequestCompressionType = CompressionTypes.Deflate, }; var response = client.Post(new Request { ... }); ``` Sending any HTTP Request containing a Request Body (e.g. POST/PUT) will then send a compressed Request body to the Server, where it's transparently decompressed and deserialized into your Request DTO.
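Under the hood this is conventional HTTP content encoding: the serialized Request DTO body is compressed and sent with a matching `Content-Encoding` header. A rough sketch of the GZip round trip using `System.IO.Compression` (illustrative only, not the client's actual internals):

```csharp
using System;
using System.IO;
using System.IO.Compression;
using System.Text;

// Sketch: what GZip Request compression does to a serialized JSON body.
var json = "{\"Name\":\"World!\"}";

byte[] Compress(string text)
{
    using var ms = new MemoryStream();
    using (var gzip = new GZipStream(ms, CompressionMode.Compress))
        gzip.Write(Encoding.UTF8.GetBytes(text));
    return ms.ToArray(); // sent as the Request body with Content-Encoding: gzip
}

string Decompress(byte[] bytes)
{
    using var gzip = new GZipStream(new MemoryStream(bytes), CompressionMode.Decompress);
    using var reader = new StreamReader(gzip, Encoding.UTF8);
    return reader.ReadToEnd(); // what the Server deserializes into the Request DTO
}

var roundTripped = Decompress(Compress(json));
Console.WriteLine(roundTripped == json); // True
```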
## Authentication ServiceStack's [Auth Tests](https://github.com/ServiceStack/ServiceStack/blob/master/tests/ServiceStack.WebHost.Endpoints.Tests/AuthTests.cs#L108) show different ways of authenticating when using the C# Service Clients. By default BasicAuth and DigestAuth are built into the clients, e.g: ```csharp var client = new JsonServiceClient(baseUri) { UserName = UserName, Password = Password, }; var request = new Secured { Name = "test" }; var response = client.Send<SecureResponse>(request); ``` Behind the scenes ServiceStack will attempt to send the request normally, but when the request is rejected and challenged by the Server the clients will automatically retry the same request, this time with the Basic/Digest Auth headers. To skip the extra hop when you know you're accessing a secure service, you can tell the clients to always send the BasicAuth header with: ```csharp client.AlwaysSendBasicAuthHeader = true; ``` ### Sending Authenticate Request DTO The alternative way to Authenticate is to make an explicit call to the `Authenticate` service (this requires `CredentialsAuthProvider` enabled) e.g: ```csharp AuthenticateResponse authResponse = client.Post(new Authenticate { provider = CredentialsAuthProvider.Name, //= credentials UserName = "user", Password = "p@55word", RememberMe = true, //important: tells client to retain permanent cookies }); var request = new Secured { Name = "test" }; var response = client.Send<SecureResponse>(request); ``` After a successful call to the `Authenticate` service the client is Authenticated and, if **RememberMe** is set, the client will retain the Session Cookies added by the Server on subsequent requests, which is what enables future requests from that client to be authenticated.
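For reference, the Basic Auth header the clients send (on the authenticated retry, or on every request with `AlwaysSendBasicAuthHeader`) is the standard `Authorization: Basic` scheme, i.e. the Base64 encoding of `{UserName}:{Password}`. A sketch of its construction with illustrative credentials (not the client's internals):

```csharp
using System;
using System.Text;

// Sketch: the standard Basic Auth header value per RFC 7617.
// Credentials are illustrative only.
var userName = "user";
var password = "pass";
var basicAuth = "Basic " + Convert.ToBase64String(
    Encoding.UTF8.GetBytes($"{userName}:{password}"));

Console.WriteLine(basicAuth); // Basic dXNlcjpwYXNz
```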
### Request and Response Filters When needing to execute custom logic before and after requests are sent and received you can use Global Request/Response Filters: ```csharp // Executed for all .NET HttpWebRequest ServiceClient instances like JsonServiceClient: ServiceClientBase.GlobalRequestFilter = (HttpWebRequest req) => { ... }; ServiceClientBase.GlobalResponseFilter = (HttpWebResponse res) => { ... }; // Executed for all JsonHttpClient instances JsonHttpClient.GlobalRequestFilter = (HttpRequestMessage req) => { ... }; JsonHttpClient.GlobalResponseFilter = (HttpResponseMessage res) => { ... }; ``` Or use instance Request/Response Filters if you only want to run custom logic for specific instances: ```csharp var client = new JsonServiceClient(baseUrl) { RequestFilter = req => { ... }, ResponseFilter = res => { ... }, }; var client = new JsonHttpClient(baseUrl) { RequestFilter = req => { ... }, ResponseFilter = res => { ... }, }; ``` ### Upload and Download Progress on Async API's The Async APIs support progress updates via the `OnDownloadProgress` and `OnUploadProgress` callbacks which can be used to provide UX Progress updates, e.g: ```csharp var client = new JsonServiceClient(ListeningOn); //Available in ASP.NET/HttpListener when downloading responses with known lengths //E.g: Strings, Files, etc. client.OnDownloadProgress = (done, total) => "{0}/{1} bytes downloaded".Print(done, total); var response = await client.GetAsync(new Request()); ``` ::: info total = -1 when 'Transfer-Encoding: chunked' ::: The `OnUploadProgress` callback gets fired when uploading files, e.g: ```csharp client.OnUploadProgress = (bytesWritten, total) => "Written {0}/{1} bytes...".Print(bytesWritten, total); client.PostFileWithRequest<UploadResponse>(url, new FileInfo(path), new Upload { CreatedBy = "Me" }); ``` ### Custom Client Caching Strategy The `ResultsFilter` and `ResultsFilterResponse` delegates on Service Clients can be used to enable a custom caching strategy.
Here's a basic example implementing a cache for all **GET** Requests: ```csharp var cache = new Dictionary<string, object>(); client.ResultsFilter = (type, method, uri, request) => { if (method != HttpMethods.Get) return null; object cachedResponse; cache.TryGetValue(uri, out cachedResponse); return cachedResponse; }; client.ResultsFilterResponse = (webRes, response, method, uri, request) => { if (method != HttpMethods.Get) return; cache[uri] = response; }; //Subsequent requests return the cached result var response1 = client.Get(new GetCustomer { CustomerId = 5 }); var response2 = client.Get(new GetCustomer { CustomerId = 5 }); //cached response ``` The `ResultsFilter` delegate is executed with the context of the request before the request is made. Returning a value of type `TResponse` short-circuits the request and returns that response. Otherwise the request continues and its response is passed into the `ResultsFilterResponse` delegate where it can be cached. ### Implicitly populate SessionId and Version Number Service Clients can be used to auto-populate Request DTOs implementing `IHasSessionId` or `IHasVersion` by assigning the `Version` and `SessionId` properties on the Service Client, e.g: ```csharp client.Version = 1; client.SessionId = authResponse.SessionId; ``` Which populates the SessionId and Version number on each Request DTO that implements these interfaces, e.g: ```csharp public class Hello : IReturn<HelloResponse>, IHasSessionId, IHasVersion { public int Version { get; set; } public string SessionId { get; set; } public string Name { get; set; } } client.Get(new Hello { Name = "World" }); //Auto populates Version and SessionId ``` ### HTTP Verb Interface Markers You can decorate your Request DTOs using the `IGet`, `IPost`, `IPut`, `IDelete` and `IPatch` interface markers and the `Send` and `SendAsync` APIs will use them to automatically send the Request using the selected HTTP Method.
E.g: ```csharp public class HelloByGet : IGet, IReturn<HelloResponse> { public string Name { get; set; } } public class HelloByPut : IPut, IReturn<HelloResponse> { public string Name { get; set; } } var response = client.Send(new HelloByGet { Name = "World" }); //GET await client.SendAsync(new HelloByPut { Name = "World" }); //PUT ``` Interface markers are supported in all .NET Service Clients and are also included in the generated [Add ServiceStack Reference](/add-servicestack-reference) DTOs, so they're also available in the [Java JsonServiceClient](/java-add-servicestack-reference) and [Swift JsonServiceClient](/swift-add-servicestack-reference). It's also available in our 3rd Party [StripeGateway](https://github.com/ServiceStack/Stripe). Whilst a simple feature, it enables treating your remote services as a message-based API [yielding its many inherent advantages](/advantages-of-message-based-web-services#advantages-of-message-based-designs) where your Application APIs need only pass Request DTO models around to be able to invoke remote Services, decoupling the Service Request from its implementation which can now be easily managed by a high-level adapter that takes care of proxying the Request to the underlying Service Client. The adapter could also add high-level functionality of its own including auto retrying of failed requests, generic error handling, logging/telemetry, event notification, throttling, offline queuing/syncing, etc.
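As a sketch of that adapter idea, the helper below wraps any "send a Request DTO" function with auto-retry. `WithRetry` and the flaky test double are hypothetical names for illustration, not ServiceStack APIs; in practice the inner function would delegate to a Service Client's `Send`:

```csharp
using System;

// Hypothetical adapter: wrap any "send a Request DTO" delegate with auto-retry.
// Logging, throttling, etc. could be layered the same way.
static Func<object, TResponse> WithRetry<TResponse>(
    Func<object, TResponse> send, int maxAttempts = 3) =>
    requestDto =>
    {
        for (var attempt = 1; ; attempt++)
        {
            try { return send(requestDto); }
            catch when (attempt < maxAttempts) { /* log + backoff before retrying */ }
        }
    };

// Test double standing in for client.Send: fails twice, then succeeds.
var calls = 0;
Func<object, string> flakySend = requestDto =>
{
    calls++;
    if (calls <= 2) throw new InvalidOperationException("transient failure");
    return "ok";
};

var send = WithRetry(flakySend, maxAttempts: 3);
var result = send(new { Name = "World" });
Console.WriteLine($"{result} after {calls} attempts"); // ok after 3 attempts
```

The `catch when` filter only swallows exceptions while attempts remain; the final attempt's exception propagates to the caller unchanged.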
## File Uploads File uploads can be accessed within Service implementations from the `Request.Files` collection which you can write to the registered [Writable Virtual Files Provider](/virtual-file-system) with: ```csharp [Route("/files/upload")] public class UploadFile {} public class UploadFileService : Service { readonly string UploadsDir = "uploads"; public object Post(UploadFile request) { var uploadedFile = base.Request.Files[0]; VirtualFiles.WriteFile(UploadsDir.CombineWith(uploadedFile.FileName), uploadedFile.InputStream); return new FileUploadResponse { ... }; } } ``` Alternatively [Managed File Uploads](/locode/files-overview) can provide a more effortless solution for configuring custom validation, multiple upload locations and also includes File APIs to access & manage file uploads. ### Uploading File with Request The Service Clients utilize standard [HTTP multipart/form-data](https://www.ietf.org/rfc/rfc2388.txt) Content-Type for uploading files as demonstrated in Talent Blazor's [FileUploadTests.cs](https://github.com/NetCoreApps/TalentBlazor/blob/main/TalentBlazor.Tests/FileUploadTests.cs) which uploads a single attachment when creating a Contact with a Profile Image and multiple file attachments when submitting a Job Application: ```csharp var profileImg = await ProfileImageUrl.GetStreamFromUrlAsync(); var contact = await client.PostFileWithRequestAsync<Contact>(profileImg, "cody-fisher.png", new CreateContact { FirstName = "Cody", LastName = "Fisher", Email = "cody.fisher@gmail.com", JobType = "Security", PreferredLocation = "Remote", PreferredWorkType = EmploymentType.FullTime, AvailabilityWeeks = 1, SalaryExpectation = 100_000, About = "Lead Security Associate", }, fieldName:nameof(CreateContact.ProfileUrl)); // contact.ProfileUrl = /profiles/cody-fisher.png var uploadedImage = await client.BaseUri.CombineWith(contact.ProfileUrl).GetStreamFromUrlAsync(); var coverLetter = new FileInfo($"{AppData}/sample_coverletter.pdf"); var resume = new 
FileInfo($"{AppData}/sample_resume.pdf"); var attachmentsField = nameof(CreateJobApplication.Attachments); var uploadAttachments = new UploadFile[] { new(coverLetter.Name, coverLetter.OpenRead(), attachmentsField), new(resume.Name, resume.OpenRead(), attachmentsField), new(contact.ProfileUrl.LastRightPart('/'), uploadedImage, attachmentsField), }; var jobApp = await client.PostFilesWithRequestAsync<JobApplication>(new CreateJobApplication { JobId = 1, AppliedDate = DateTime.UtcNow, ContactId = contact.Id, }, uploadAttachments); uploadAttachments.Each(x => x.Stream.Dispose()); ``` This example also shows APIs are able to submit files from any `Stream` that can be sourced from anywhere, including the HTTP Response stream of a Remote URI or files from a local hard drive. ### Using HttpClient MultipartFormDataContent The [.NET 6+ JsonApiClient](/csharp-client#jsonapiclient) lets us provide an even more flexible approach by utilizing `MultipartFormDataContent()` which we've enhanced with high-level extension methods to enable a Fluent API for constructing custom API Requests populated from multiple sources, which can be sent using its `ApiForm*` methods: ```csharp var profileImg = await ProfileImageUrl.GetStreamFromUrlAsync(); using var createContact = new MultipartFormDataContent() .AddParams(new CreateContact { FirstName = "Cody", LastName = "Fisher", Email = "cody.fisher@gmail.com", JobType = "Security", PreferredLocation = "Remote", PreferredWorkType = EmploymentType.FullTime, AvailabilityWeeks = 1, SalaryExpectation = 100_000, About = "Lead Security Associate", }) .AddFile(nameof(CreateContact.ProfileUrl), "cody-fisher.png", profileImg); var contactApi = await client.ApiFormAsync<Contact>(typeof(CreateContact).ToApiUrl(), createContact); // contactApi.Succeeded = true var contact = contactApi.Response!; // contact.ProfileUrl = /profiles/cody-fisher.png using var uploadedImage = await client.BaseUri.CombineWith(contact.ProfileUrl).GetStreamFromUrlAsync(); var
coverLetter = new FileInfo($"{AppData}/sample_coverletter.pdf"); var resume = new FileInfo($"{AppData}/sample_resume.pdf"); var attachmentsField = nameof(CreateJobApplication.Attachments); var createJobApp = new MultipartFormDataContent() .AddParams(new CreateJobApplication { JobId = 1, AppliedDate = DateTime.UtcNow, ContactId = contact.Id, }) .AddFile(attachmentsField, coverLetter) .AddFile(attachmentsField, resume) .AddFile(attachmentsField, contact.ProfileUrl.LastRightPart('/'), uploadedImage); var jobAppApi = await client.ApiFormAsync<JobApplication>( typeof(CreateJobApplication).ToApiUrl(), createJobApp); // jobAppApi.Succeeded = true var jobApp = jobAppApi.Response!; ``` ::: tip All `JsonApiClient` Async APIs also have [safe sync equivalents](/csharp-client#safe-sync-httpclient-apis) when access outside an async method is needed ::: ### Upload a single File You can use the `PostFile` API to upload a single File, with the Route of the Service you want to call, the name of the file and the `Stream` of its contents, e.g: ```csharp var client = new JsonServiceClient(baseUrl); using var fileStream = new FileInfo(filePath).OpenRead(); var fileName = "upload.html"; var response = client.PostFile<FileUploadResponse>("/files/upload", fileStream, fileName, MimeTypes.GetMimeType(fileName)); ``` Files uploaded using the `PostFile*` APIs are uploaded as a HTTP POST using the `multipart/form-data` Content-Type which can be accessed from the `IRequest.Files` collection in your Services, e.g: ```csharp [Route("/files/upload")] public class UploadFile {} public class UploadFileService : Service { readonly string UploadsDir = "uploads"; public object Post(UploadFile request) { var uploadedFile = base.Request.Files[0]; VirtualFiles.WriteFile(UploadsDir.CombineWith(uploadedFile.FileName), uploadedFile.InputStream); return new FileUploadResponse { ... 
}; } } ``` You can use the `PostFileWithRequest` API to also include additional metadata with your File Upload, e.g: ```csharp [DataContract] [Route("/files/upload")] public class FileUpload : IReturn<FileUploadResponse> { [DataMember] public int CustomerId { get; set; } [DataMember] public DateTime CreatedDate { get; set; } } var client = new JsonApiClient(baseUrl); var fileInfo = new FileInfo(filePath); using var fileStream = fileInfo.OpenRead(); var request = new FileUpload { CustomerId = customerId, CreatedDate = fileInfo.CreationTimeUtc, }; var response = client.PostFileWithRequest<FileUploadResponse>( "/files/upload", fileStream, fileInfo.Name, request); ``` ### Multiple File Uploads The `PostFilesWithRequest` APIs available in all .NET Service Clients allow you to easily upload multiple streams within a single HTTP request. They support populating the Request DTO with any combination of QueryString and POST'ed FormData in addition to multiple file upload data streams: ```csharp using var stream1 = uploadFile1.OpenRead(); using var stream2 = uploadFile2.OpenRead(); var client = new JsonServiceClient(baseUrl); var response = client.PostFilesWithRequest<MultipleFileUploadResponse>( "/multi-fileuploads?CustomerId=123", new MultipleFileUpload { CustomerName = "Foo,Bar" }, new[] { new UploadFile("upload1.png", stream1), new UploadFile("upload2.png", stream2), }); ``` The following example uses only a Typed Request DTO.
The `JsonApiClient` also includes async equivalents for each of the `PostFilesWithRequest` APIs: ```csharp using var stream1 = uploadFile1.OpenRead(); using var stream2 = uploadFile2.OpenRead(); var client = new JsonApiClient(baseUrl); var response = await client.PostFilesWithRequestAsync<MultipleFileUploadResponse>( new MultipleFileUpload { CustomerId = 123, CustomerName = "Foo,Bar" }, new[] { new UploadFile("upload1.png", stream1), new UploadFile("upload2.png", stream2), }); ``` ### Versatile Multi Part Content Type APIs [AutoQueryCrudTests.References.cs](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack/tests/ServiceStack.WebHost.Endpoints.Tests/AutoQueryCrudTests.References.cs) showcases how we can take advantage of `MultipartFormDataContent` to construct custom requests using a combination of different Content Type sources, including single and multiple file attachments within a single request: ```csharp public class MultipartRequest : IPost, IReturn<MultipartRequest> { public int Id { get; set; } public string String { get; set; } // Complex types sent as JSV by default public Contact Contact { get; set; } [MultiPartField(MimeTypes.Json)] public PhoneScreen PhoneScreen { get; set; } [MultiPartField(MimeTypes.Csv)] public List<Contact> Contacts { get; set; } [UploadTo("profiles")] public string ProfileUrl { get; set; } [UploadTo("applications")] public List<UploadedFile> UploadedFiles { get; set; } } ``` [Complex types are sent using JSV](/serialization-deserialization) by default which is a more human & wrist-friendly and more efficient format than JSON, however we could also take advantage of the flexibility in HTTP **multipart/form-data** requests to construct an HTTP API Request utilizing multiple Content-Type's optimized for the data we're sending, e.g: - JSON/JSV more optimal for hierarchical graph data - CSV more optimal for sending tabular data - File Uploads are more optimal for sending large files To facilitate this in our Server 
APIs we can use the `[MultiPartField]` attribute to instruct ServiceStack which registered serializer it should use to deserialize the form-data payload, whilst we can continue using the generic `[UploadTo]` attribute in normal APIs to handle our File Uploads and populate the Request DTO with the uploaded file metadata. Our `MultipartFormDataContent` extension methods simplify our client logic by allowing us to easily populate this custom request in a single Fluent construction expression: ```csharp using var content = new MultipartFormDataContent() .AddParam(nameof(MultipartRequest.Id), 1) .AddParam(nameof(MultipartRequest.String), "foo") .AddParam(nameof(MultipartRequest.Contact), new Contact { Id = 1, FirstName = "First", LastName = "Last" }) .AddJsonParam(nameof(MultipartRequest.PhoneScreen), new PhoneScreen { Id = 3, JobApplicationId = 1, Notes = "The Notes"}) .AddCsvParam(nameof(MultipartRequest.Contacts), new[] { new Contact { Id = 2, FirstName = "First2", LastName = "Last2" }, new Contact { Id = 3, FirstName = "First3", LastName = "Last3" }, }) .AddFile(nameof(MultipartRequest.ProfileUrl), "profile.txt", file1Stream) .AddFile(nameof(MultipartRequest.UploadedFiles), "uploadedFiles1.txt", file2Stream) .AddFile(nameof(MultipartRequest.UploadedFiles), "uploadedFiles2.txt", file3Stream); var api = await client.ApiFormAsync<MultipartRequest>(typeof(MultipartRequest).ToApiUrl(), content); if (!api.Succeeded) api.Error.PrintDump(); ``` ## Capture HTTP Headers in .NET Service Clients A common issue when trying to diagnose service integration issues is wanting to inspect the full HTTP traffic to help identify issues. Inside .NET Applications this would typically require using an external packet sniffer like Fiddler, but you can now capture the raw HTTP traffic of all .NET `*ServiceClient` instances with the `CaptureHttp()` API.
To print HTTP requests to the Console use: ```csharp var client = new JsonServiceClient(BaseUrl); client.CaptureHttp(print:true); var authResponse = client.Send(new Authenticate { provider = "credentials", UserName = "admin", Password = "test" }); ``` Which will print out the raw HTTP Request & Response Headers and body to the Console, e.g: ``` POST /json/reply/Authenticate HTTP/1.1 Host: test.servicestack.net Accept: application/json User-Agent: ServiceStack .NET Client 5.121 Accept-Encoding: gzip,deflate Content-Type: application/json {"provider":"credentials","UserName":"admin","Password":"test"} HTTP/1.1 200 OK Server: nginx/1.18.0, (Ubuntu) Date: Sat, 21 Aug 2021 09:51:34 GMT Transfer-Encoding: chunked Connection: keep-alive Set-Cookie: ss-id=o7VAdXm7JKLy92XiQcQQ; path=/; samesite=strict; httponly, ss-pid=I2MdbrzWZILqNCOqGlyR; expires=Wed, 21 Aug 2041 09:51:34 GMT; path=/; samesite=strict; httponly, ss-opt=temp; expires=Wed, 21 Aug 2041 09:51:34 GMT; path=/; samesite=strict; httponly, X-UAId=2; expires=Wed, 21 Aug 2041 09:51:34 GMT; path=/; samesite=strict; httponly, ss-tok=eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiIsImtpZCI6IjNuLyJ9.eyJzdWIiOjIsImlhdCI6MTYyOTUzOTQ5NCwiZXhwIjoxNjMwNzQ5MDk0LCJlbWFpbCI6ImFkbWluQGdtYWlsLmNvbSIsImdpdmVuX25hbWUiOiJGaXJzdCBhZG1pbiIsImZhbWlseV9uYW1lIjoiTGFzdCBhZG1pbiIsIm5hbWUiOiJhZG1pbiBEaXNwbGF5TmFtZSIsInByZWZlcnJlZF91c2VybmFtZSI6ImFkbWluIiwicm9sZXMiOlsiQWRtaW4iXSwianRpIjoxMTR9.rHk-OdCwd8wR4AsT7exLRUr59-mzFs0FvKZUeZhvKMI; expires=Sat, 04 Sep 2021 09:51:34 GMT; path=/; samesite=strict; httponly, ss-reftok=eyJ0eXAiOiJKV1RSIiwiYWxnIjoiSFMyNTYiLCJraWQiOiIzbi8ifQ.eyJzdWIiOjIsImlhdCI6MTYyOTUzOTQ5NCwiZXhwIjoxNjYxMDc1NDk0LCJqdGkiOi02OX0.35MpYdz-QIkbVf98y_wNTA9PIYDy_EEQc3zfkpFvuQc; expires=Sun, 21 Aug 2022 09:51:34 GMT; path=/; samesite=strict; httponly Vary: Accept Access-Control-Allow-Credentials: true Access-Control-Allow-Headers: Content-Type, Allow, Authorization, X-Args Access-Control-Allow-Methods: GET, POST, PUT, DELETE, PATCH, OPTIONS, HEAD 
X-Powered-By: ServiceStack/5.111 NetCore/Linux X-Cookies: ss-tok,ss-reftok Content-Type: application/json; charset=utf-8 {"userId":"2","sessionId":"o7VAdXm7JKLy92XiQcQQ","userName":"admin","displayName":"admin DisplayName","bearerToken":"eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiIsImtpZCI6IjNuLyJ9.eyJzdWIiOjIsImlhdCI6MTYyOTUzOTQ5NCwiZXhwIjoxNjMwNzQ5MDk0LCJlbWFpbCI6ImFkbWluQGdtYWlsLmNvbSIsImdpdmVuX25hbWUiOiJGaXJzdCBhZG1pbiIsImZhbWlseV9uYW1lIjoiTGFzdCBhZG1pbiIsIm5hbWUiOiJhZG1pbiBEaXNwbGF5TmFtZSIsInByZWZlcnJlZF91c2VybmFtZSI6ImFkbWluIiwicm9sZXMiOlsiQWRtaW4iXSwianRpIjoxMTR9.rHk-OdCwd8wR4AsT7exLRUr59-mzFs0FvKZUeZhvKMI","refreshToken":"eyJ0eXAiOiJKV1RSIiwiYWxnIjoiSFMyNTYiLCJraWQiOiIzbi8ifQ.eyJzdWIiOjIsImlhdCI6MTYyOTUzOTQ5NCwiZXhwIjoxNjYxMDc1NDk0LCJqdGkiOi02OX0.35MpYdz-QIkbVf98y_wNTA9PIYDy_EEQc3zfkpFvuQc","profileUrl":"data:image/svg+xml,...","roles":["Admin"],"permissions":[],"responseStatus":{}} ``` Alternatively you can log it to the debug logger with: ```csharp var client = new JsonServiceClient(BaseUrl); client.CaptureHttp(log:true); ``` Or, if preferred, capture it in a `StringBuilder` to inspect later by disabling clearing after each request: ```csharp var client = new JsonServiceClient(BaseUrl); client.CaptureHttp(clear:false); ``` Which will begin capturing all HTTP requests made by that client in a `StringBuilder` you can access with: ```csharp client.HttpLog ``` ### ServiceClient URL Resolvers The URLs used in all .NET Service Clients are now customizable with the new `UrlResolver` and `TypedUrlResolver` delegates. E.g.
you can use this feature to rewrite the URL so the Request DTO Type Name is used as the subdomain: ```csharp [Route("/test")] class Request {} var client = new JsonServiceClient("http://example.org/api") { TypedUrlResolver = (meta, httpMethod, dto) => meta.BaseUri.Replace("example.org", dto.GetType().Name + ".example.org") .CombineWith(dto.ToUrl(httpMethod, meta.Format)) }; var res = client.Get(new Request()); //= http://Request.example.org/api/test var res = client.Post(new Request()); //= http://Request.example.org/api/test ``` This feature is also implemented in `JsonHttpClient`; the examples below show rewriting APIs that use custom URLs: ```csharp var client = new JsonHttpClient("http://example.org/api") { UrlResolver = (meta, httpMethod, url) => meta.BaseUri.Replace("example.org", "111.111.111.111").CombineWith(url) }; await client.DeleteAsync<MockResponse>("/dummy"); //=http://111.111.111.111/api/dummy await client.PutAsync<MockResponse>("/dummy", new Request()); //=http://111.111.111.111/api/dummy ``` ## [ServiceStack.Discovery.Consul](https://github.com/wwwlicious/servicestack-discovery-consul) This feature makes it easier to support features like the [ServiceStack.Discovery.Consul](https://github.com/wwwlicious/servicestack-discovery-consul) plugin which enables external RequestDTO endpoint discovery by integrating with [Consul.io](http://consul.io) to provide automatic service registration and health checking. ## Built-in Clients All REST and ServiceClients share the same interfaces (`IServiceClient`, `IRestClient` and `IRestClientAsync`) so they can easily be replaced (for increased perf/debuggability/etc) with a single line of code. ### JsonHttpClient The new `JsonHttpClient` is an alternative to the existing generic typed `JsonServiceClient` for consuming ServiceStack Services which instead of using **HttpWebRequest** is based on Microsoft's latest async [HttpClient](https://www.nuget.org/packages/Microsoft.Net.Http).
JsonHttpClient implements the full [IServiceClient API](https://gist.github.com/mythz/4683438240820b522d39) making it an easy drop-in replacement for your existing JsonServiceClient where in most cases it can simply be renamed to JsonHttpClient, e.g:

```csharp
//IServiceClient client = new JsonServiceClient("https://techstacks.io");
IServiceClient client = new JsonHttpClient("https://techstacks.io");

var response = await client.GetAsync(new GetTechnology { Slug = "servicestack" });
```

::: warning
As .NET's HttpClient only supports async APIs it needs to use "sync over async" to implement sync APIs **which should be avoided**. If your API needs to make sync API calls it should use .NET 6's `JsonApiClient` or the `JsonServiceClient` instead.
:::

#### Install

JsonHttpClient can be downloaded from NuGet at:

:::copy
`<PackageReference Include="ServiceStack.HttpClient" Version="10.*" />`
:::

### Xamarin Native HttpClient

Using the default managed `HttpClient` implementation in Xamarin has a [number of issues](https://docs.microsoft.com/en-us/xamarin/cross-platform/macios/http-stack#cons-2) on iOS and Android devices. Xamarin's MSDN docs explain the advantages of native implementations and show how you can enable the [native HttpClient implementation for iOS/macOS](https://docs.microsoft.com/en-us/xamarin/cross-platform/macios/http-stack) for your project.
If you want to [programmatically enable it for iOS/macOS](https://docs.microsoft.com/en-us/xamarin/cross-platform/macios/http-stack#programmatically-setting-the-httpmessagehandler), you'll likely want to configure it once on the `GlobalHttpMessageHandlerFactory` for all `JsonHttpClient` instances to use, e.g:

```csharp
// iOS
JsonHttpClient.GlobalHttpMessageHandlerFactory = () => new NSUrlSessionHandler();
```

Or to only configure it for a specific client you can initialize an instance with:

```csharp
// iOS
var client = new JsonHttpClient(baseUrl) {
    HttpMessageHandler = new NSUrlSessionHandler()
};
```

Refer to the [Xamarin MSDN docs for Android HttpClient](https://docs.microsoft.com/en-us/xamarin/android/app-fundamentals/http-stack?tabs=windows) for how to enable it in your project, which can be globally programmatically configured with:

```csharp
// Android
JsonHttpClient.GlobalHttpMessageHandlerFactory = () =>
    new Xamarin.Android.Net.AndroidClientHandler();
```

Or per instance with:

```csharp
// Android
var client = new JsonHttpClient(baseUrl) {
    HttpMessageHandler = new Xamarin.Android.Net.AndroidClientHandler()
};
```

### Differences with JsonServiceClient

Whilst the goal is to retain the same behavior in both clients, there are some differences resulting from using HttpClient, where the Global and Instance Request and Response Filters are instead passed HttpClient's `HttpRequestMessage` and `HttpResponseMessage`.

Also, all API's are **Async** under-the-hood where any Sync API's that don't return a `Task<T>` just block on the Async `Task.Result` response. As this can dead-lock in certain environments we recommend sticking with the Async API's unless safe to do otherwise.
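For instance, a minimal sketch contrasting the two call styles (assuming the `GetTechnology` Request DTO from the earlier example):

```csharp
IServiceClient client = new JsonHttpClient("https://techstacks.io");

// Preferred: genuinely async all the way down
var response = await client.GetAsync(new GetTechnology { Slug = "servicestack" });

// Works, but blocks on Task.Result under-the-hood and can dead-lock in
// environments with a SynchronizationContext (e.g. classic ASP.NET, UI threads)
var syncResponse = client.Get(new GetTechnology { Slug = "servicestack" });
```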
### HttpWebRequest Service Clients

The list below contains the built-in clients based on .NET's `HttpWebRequest`:

- implements both `IRestClient` and `IServiceClient`:
    - [JsonServiceClient](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Client/JsonServiceClient.cs) (uses default endpoint with **JSON**) - recommended
    - [JsvServiceClient](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Client/JsvServiceClient.cs) (uses default endpoint with **JSV**)
    - [XmlServiceClient](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Client/XmlServiceClient.cs) (uses default endpoint with **XML**)
    - [CsvServiceClient](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Client/CsvServiceClient.cs) (uses default endpoint with **CSV**)
    - [MsgPackServiceClient](/messagepack-format) (uses default endpoint with **Message-Pack**)
    - [ProtoBufServiceClient](/protobuf-format) (uses default endpoint with **Protocol Buffers**)
- implements `IServiceClient` only:
    - [Soap11ServiceClient](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Client/Soap11ServiceClient.cs) (uses **SOAP 11** endpoint)
    - [Soap12ServiceClient](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Client/Soap12ServiceClient.cs) (uses **SOAP 12** endpoint)

#### Install

The HttpWebRequest clients above are available in:

:::copy
`<PackageReference Include="ServiceStack.Client" Version="10.*" />`
:::

# Community Resources

- [Reactive ServiceStack](https://gist.github.com/bamboo/5078236) by [@rodrigobamboo](https://twitter.com/rodrigobamboo)

---
# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/csharp-server-events-client.md
---
slug: csharp-server-events-client
title: C# Server Events Client
---

Like ServiceStack's other [C# Service Clients](/csharp-client), the new `ServerEventsClient` is a [portable
library](https://github.com/ServiceStackApps/HelloMobile) contained in the `ServiceStack.Client` NuGet package:

:::copy
`<PackageReference Include="ServiceStack.Client" Version="10.*" />`
:::

And like the Service Clients it requires the `BaseUri` of your ServiceStack instance as well as an optional `channel` for the client to subscribe to:

```csharp
var client = new ServerEventsClient(
    "http://chat.netcore.io", channels:"home");
```

### Managed Connection

The **C# ServerEvent Client** is a managed .NET client with feature parity with [ServiceStack's JavaScript client](https://github.com/ServiceStackApps/Chat#client-bindings---ss-utilsjs) that **auto-reconnects** when a connection is lost, **sends periodic heartbeats** to maintain an active subscription as well as **auto-unregistering** once the client stops listening for messages, or gets disposed.

### Handling Server Events

Unlike other C# clients, the ServerEvents Client is mainly reactive in that it's primarily waiting for Server Events to be initiated from a remote server instead of the typical scenario in which requests are initiated by clients. To maximize utility, there are a number of different API's to receive and process messages:

### Assigning Callback Handlers

One way to receive messages (useful in long-running clients) is to assign handlers for each of the different events that are fired.
This example shows how to capture all the different events a Client can receive:

```csharp
ServerEventConnect connectMsg = null;
var msgs = new List<ServerEventMessage>();
var commands = new List<ServerEventMessage>();
var errors = new List<Exception>();

var client = new ServerEventsClient(baseUri) {
    OnConnect = e => connectMsg = e,
    OnCommand = commands.Add,
    OnMessage = msgs.Add,
    OnException = errors.Add,
}.Start();
```

Once the Client is configured, calling `Start()` will start listening for messages and calling `Stop()` or `Dispose()` will cancel the background HTTP connection and stop it listening for server events.

### Customizing Metadata sent to clients

As ServerEvents have deep integration with the rest of ServiceStack we're able to offer [Typed Messages](https://github.com/ServiceStack/ServiceStack/blob/71b51d231d1ddb2ba7da39613e216ab75fd181c0/src/ServiceStack.Client/ServerEventsClient.cs#L14-L44) containing the user's `UserAuthId`, `DisplayName` and the `ProfileUrl` of the user's avatar when it's available.

The typed messages also offer an extensible `Dictionary<string,string> Meta` collection for maintaining custom metadata that can be sent to clients by appending to them in the ServerEventsFeature hooks, which can be defined when registering `ServerEventsFeature`:

```csharp
Plugins.Add(new ServerEventsFeature {
    // private Connect args
    OnConnect = (subscription,httpReq) => AppendTo(subscription.Meta),
    // public Join/Leave args
    OnCreated = (subscription,httpReq) => AppendTo(subscription.Meta),
});
```

Whilst distinct `OnJoin`, `OnLeave`, `OnUpdate` and `OnReconnect` callbacks can be used to handle a specific event, e.g:

```csharp
var client = new ServerEventsClient(baseUrl, channel) {
    OnJoin = msg => ...,
    OnLeave = msg => ...,
    OnUpdate = msg => ...,
    OnReconnect = () => ...
};
```

### Using C# Async/Await friendly API's

Depending on your use-case, if you only want to use the ServerEvent Client for a short time to listen for predictable responses (i.e.
waiting for a Server callback on a pending request) you can alternatively use the Task-based API's letting you participate in C# async/await workflows:

```csharp
var client = new ServerEventsClient(baseUri, channels: "Home");

// Wait to receive onConnect event
ServerEventConnect connectMsg = await client.Connect();

// Wait to receive onJoin command event
ServerEventCommand joinMsg = await client.WaitForNextCommand();

// Hold a future task to get notified once a msg has been received
Task<ServerEventMessage> msgTask = client.WaitForNextMessage();

// Send a Web Service Request using the built-in JsonServiceClient
client.ServiceClient.Post(new PostChatToChannel {
    Channel = client.Channel,     // The channel we're listening on
    From = client.SubscriptionId, // Populated after Connect()
    Message = "Hello, World!",
});

// Wait till we receive the chat Msg event we sent earlier
ServerEventMessage msg = await msgTask;
```

The above example showcases the **3 Task-based API's** available:

1. `Connect()` wait till receiving confirmation of a successful event subscription
2. `WaitForNextCommand()` wait for the next `onJoin` or `onLeave` subscription events
3. `WaitForNextMessage()` wait for the next message published to the channel

The `ServiceClient` property lets you access a `JsonServiceClient` that's pre-configured with the client's `BaseUri` so it's primed for sending Web Service Requests.

After the ServerEvent Client has connected, the `ConnectionInfo` property is populated with the typed `ServerEventConnect` response.

### Message Event Handlers

The above examples show generic API's for receiving any type of message, but just like in the JavaScript client, more fine-grained API's are available for handling specific message types.
The `Handlers` dictionary is akin to the JavaScript Client's [Global Event Handlers](https://github.com/ServiceStackApps/Chat#global-event-handlers) which specify lambdas to be executed when messages are sent with the `cmd.*` selector:

```csharp
client.Handlers["chat"] = (client, msg) => {
    //Deserialize JSON string to typed DTO
    var chatMsg = msg.Json.FromJson<ChatMessage>();
    "Received '{0}' from '{1}'".Print(chatMsg.Message, chatMsg.FromName);
};
```

Roughly translates to the equivalent JavaScript below:

```javascript
$(source).handleServerEvents({
    handlers: {
        chat: function (msg, event) {
            console.log("Received " + msg.message + " from " + msg.fromName);
        }
    }
});
```

Where both methods handle the `ChatMessage` sent with the `cmd.chat` selector.

### Named Receivers

Whilst handlers provide a light way to handle loose-typed messages, there's a more structured and typed option that works similar to ServiceStack's `IService` classes but is used to instead handle typed Server Event Messages.

To be able to handle messages with your own classes, get them to implement the `IReceiver` empty marker interface:

```csharp
public interface IReceiver
{
    void NoSuchMethod(string selector, object message);
}
```

Whilst primarily a marker interface, `IReceiver` does include a `NoSuchMethod` API to be able to handle messages sent with an unknown selector **target** that doesn't match any defined method or property.
**Named Receivers** are equivalent to [Receivers](https://github.com/ServiceStackApps/Chat#receivers) in the JavaScript client which can be assigned to handle all messages sent to a receiver with the selector format: ``` {receiver}.{target} ``` A Named Receiver can be registered with the API below: ```csharp client.RegisterNamedReceiver<TestNamedReceiver>("test"); ``` Which will forward all messages with a `test.*` selector to an instance of the `TestNamedReceiver` Type ```csharp public class TestNamedReceiver : ServerEventReceiver { public void FooMethod(CustomType request) {} // void return type public CustomType BarMethod(CustomType request) { return request; // works with any return type, which are ignored } public CustomType BazSetter { get; set; } // Auto populate properties public override void NoSuchMethod(string selector, object message) { var msg = (ServerEventMessage)message; var nonExistentMethodType = msg.Json.FromJson<CustomType>(); } } ``` This is roughly equivalent to the following JavaScript code: ```javascript $(source).handleServerEvents({ receivers: { test: { FooMethod: function (msg, event) { ... }, BarMethod: function (msg, event) { ... }, BazSetter: null, } } }); ``` > The [ServerEventReceiver](https://github.com/ServiceStack/ServiceStack/blob/68c7159037e7cf2a519d482b7dae524ca073da20/src/ServiceStack.Client/ServerEventsClient.Receiver.cs#L16-L28) is a convenient base class that in addition to implementing `IReceiver` interface, gets injected with the `Client` as well as additional context about the raw message available in `base.Request`. #### Unknown Message Handling One difference in the JavaScript client is that messages with **unknown** targets are assigned as properties on the `test` receiver, e.g `test.QuxTarget = {..}`. ### Sending messages to Named Receivers Once registered, an instance of `TestNamedReceiver` will process messages sent with a `test.*` selector. 
The example below shows how to send a DTO to each of `TestNamedReceiver` defined methods and properties: ```csharp public class MyEventServices : Service { public IServerEvents ServerEvents { get; set; } public void Any(CustomType request) { ServerEvents.NotifyChannel("home", "test.FooMethod", request); ServerEvents.NotifyChannel("home", "test.BarMethod", request); ServerEvents.NotifyChannel("home", "test.BazSetter", request); ServerEvents.NotifyChannel("home", "test.QuxTarget", request); } } ``` ### Life-cycle of Receivers Similar to **Services** in ServiceStack, each message is processed with an instance of the Receiver that's resolved from `ServerEventsClient.Resolver` which by default uses the [NewInstanceResolver](https://github.com/ServiceStack/ServiceStack/blob/ec0226b97227048c3bd7c24667a71e7af7e1ff31/src/ServiceStack.Client/ServerEventsClient.Receiver.cs#L30-L36) to execute messages using a new instance of the Receiver Type: ```csharp public class NewInstanceResolver : IResolver { public T TryResolve<T>() { return typeof(T).CreateInstance<T>(); } } ``` This can be changed to re-use the same instance by assigning a [SingletonInstanceResolver](https://github.com/ServiceStack/ServiceStack/blob/ec0226b97227048c3bd7c24667a71e7af7e1ff31/src/ServiceStack.Client/ServerEventsClient.Receiver.cs#L38-L46) instead: ```csharp public class SingletonInstanceResolver : IResolver { ConcurrentDictionary<Type, object> Cache = new ConcurrentDictionary<Type, object>(); public T TryResolve<T>() { return (T)Cache.GetOrAdd(typeof(T), type => type.CreateInstance<T>()); } } client.Resolver = new SingletonInstanceResolver(); ``` We can also have it resolve instances from your preferred IOC. Here's an example showing how to register all Receiver Types, auto-wire them with any custom dependencies, and instruct the client to resolve instances from our IOC: ```csharp // Register all Receivers: client.RegisterNamedReceiver<TestNamedReceiver>("test"); ... 
// Register all dependencies used in a new Funq.Container:
var container = new Container();
container.RegisterAs<Dependency, IDependency>();

// Go through and auto-wire all Registered Receiver Types with Funq:
container.RegisterAutoWiredTypes(client.ReceiverTypes);

// Change the client to resolve receivers from the new Funq Container:
client.Resolver = container;
```

We can assign `Funq.Container` directly as it already implements the [IResolver](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Configuration/IResolver.cs) interface, whilst you can re-use the existing IOC **Container Adapters** to [enable support for other IOCs](/ioc#use-another-ioc-container).

### The Global Receiver

Whilst Named Receivers are used to handle messages sent to a specific namespaced selector, the client also supports registering a **Global Receiver** for handling messages sent with the special `cmd.*` selector.

#### Handling Messages with the Default Selector

All `IServerEvents` Notify API's include [overloads for sending messages without a selector](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack/ServerEventsFeature.cs#L743-L771) that by convention will take the format `cmd.{TypeName}`.

These events can be handled with a Global Receiver **based on Message type**, e.g:

```csharp
public class GlobalReceiver : ServerEventReceiver
{
    public SetterType AnyNamedProperty { get; set; }

    public void AnyNamedMethod(CustomType request) { ... }
}

client.RegisterReceiver<GlobalReceiver>();
```

Which will be called when messages are sent without a selector, e.g:

```csharp
public class MyServices : Service
{
    public IServerEvents ServerEvents { get; set; }

    public void Any(Request request)
    {
        ServerEvents.NotifyChannel("home", new CustomType { ... });
        ServerEvents.NotifyChannel("home", new SetterType { ...
        });
    }
}
```

As Global Receivers handle other messages sent with the `cmd.*` selector and can be re-used as a named receiver, we can define a single class to handle all the different custom messages sent in the [chat.netcore.io](http://chat.netcore.io) App, e.g:

```
cmd.chat Hi
cmd.announce This is your captain speaking...
cmd.toggle#channels
css.background-image url(https://servicestack.net/img/bg.jpg)
...
```

The above messages can all be handled with the Receiver below:

```csharp
public class JavaScriptReceiver : ServerEventReceiver
{
    public void Chat(ChatMessage message) { ... }
    public void Announce(string message) { ... }
    public void Toggle(string message) { ... }
    public void BackgroundImage(string cssRule) { ... }
}

client.RegisterReceiver<JavaScriptReceiver>();
client.RegisterNamedReceiver<JavaScriptReceiver>("css");
```

As seen above the **target** names are **case-insensitive** and `-` are collapsed to cater for JavaScript/CSS naming conventions.

## Event Triggers

Triggers enable a pub/sub event model where multiple listeners can subscribe and be notified of an event.

Registering an event handler can be done at anytime using the `AddListener()` API, e.g:

```csharp
Action<ServerEventMessage> handler = msg => {
    Console.WriteLine($"received event {msg.Target} with arg: {msg.Json}");
};

var client = new ServerEventsClient("/", channels)
    .AddListener("customEvent", handler)
    .Start();

//Register another listener to 'customEvent' event
client.AddListener("customEvent", msg => { ... });
```

The selector to trigger this custom event is:

```
trigger.customEvent arg
trigger.customEvent {json}
```

Which can be sent in ServiceStack with a simple or complex type argument, e.g:

```csharp
ServerEvents.NotifyChannel(channel, "trigger.customEvent", "arg");
ServerEvents.NotifyChannel(channel, "trigger.customEvent", new ChatMessage { ...
}); ``` #### Removing Listeners Use `RemoveListener()` to stop listening for an event, e.g: ```csharp //Remove first event listener client.RemoveListener("customEvent", handler); ``` ## Channel Subscriber APIs The sync/async APIs below built into the C# `ServerEventsClient` will let you modify an active Server Events subscription to join new or leave existing channels: ```csharp client.UpdateSubscriber(new UpdateEventSubscriber { SubscribeChannels = new[]{ "chan1", "chan2" }, UnsubscribeChannels = new[]{ "chan3", "chan4" }, }); client.SubscribeToChannels("chan1", "chan2"); client.UnsubscribeFromChannels("chan3", "chan4"); await client.SubscribeToChannelsAsync("chan1", "chan2"); await client.UnsubscribeFromChannelsAsync("chan3", "chan4"); ``` ### onUpdate Notification As this modifies the active subscription it also publishes a new **onUpdate** notification to all channel subscribers so they're able to maintain up-to-date info on each subscriber. This can be handled together with **onJoin** and **onLeave** events using `OnCommand`: ```csharp client.OnCommand = msg => ...; //= ServerEventJoin, ServerEventLeave or ServerEventUpdate ``` ## Add Authentication support to .NET ServerEvents Client The explicit `Authenticate` and `AuthenticateAsync` API's can be used to authenticate the ServerEvents ServiceClient which **shares cookies** with the WebRequest that connects to the `/event-stream` so authenticating with the Server Events ServiceClient will also authenticate the `/event-stream` HTTP Connection: ```csharp client.Authenticate(new Authenticate { provider = CredentialsAuthProvider.Name, UserName = "user", Password = "pass", RememberMe = true, }); client.Start(); ``` This is equivalent to: ```csharp client.ServiceClient.Post(new Authenticate { provider = CredentialsAuthProvider.Name, UserName = "user", Password = "pass", RememberMe = true, }); ``` ## Custom Authentication When using a [JWT](/auth/jwt-authprovider) or [API Key](/auth/api-key-authprovider) AuthProvider 
you can [send it inside a Cookie](/auth/jwt-authprovider#sending-jwt-using-cookies) so it gets sent with client Web Requests. Otherwise you can add the JWT Token or API Key using the `EventStreamRequestFilter` which gets executed before establishing the Server Events connection, e.g:

```csharp
new ServerEventsClient(...) {
    EventStreamRequestFilter = req => req.AddBearerToken(jwt)
}
```

Alternatively you can use `ResolveStreamUrl` which lets you modify the URL used to establish the Server Events connection, which also allows you to add the JWT Token to the QueryString, e.g:

```csharp
var sseClient = new ServerEventsClient(baseUrl, channels) {
    ResolveStreamUrl = url => url.AddQueryParam("ss-tok", JWT),
    OnConnect = e => {
        $"{e.IsAuthenticated}, {e.UserId}, {e.DisplayName}".Print();
    }
}.Start();
```

This requires your JWT AuthProvider to accept JWT Tokens via the QueryString, which you can enable in ServiceStack's JWT AuthProvider with:

```csharp
new JwtAuthProvider {
    AllowInQueryString = true
}
```

To configure the API Key AuthProvider to accept API Keys in Request Params like the QueryString or FormData:

```csharp
new ApiKeyAuthProvider {
    AllowInHttpParams = true
}
```

## Troubleshooting

The Server Events Client uses .NET's `HttpWebRequest` internally for its long-running SSE connection and periodic heartbeats, so if you're also using other .NET ServiceClients to make API requests back to the same server you'll quickly hit its default limit (2) on the number of requests allowed for a single domain, which can be increased by changing [ServicePointManager.DefaultConnectionLimit](https://msdn.microsoft.com/en-us/library/system.net.servicepointmanager.defaultconnectionlimit(v=vs.110).aspx), e.g:

```csharp
ServicePointManager.DefaultConnectionLimit = maxNumOfConcurrentConnections;
```

# ServerEvent .NET Examples

## [Xamarin.Android Chat](https://github.com/ServiceStackApps/AndroidXamarinChat)

Xamarin.Android Chat utilizes the [.NET PCL Server Events
Client](/csharp-server-events-client) to create an Android Chat App connecting to the existing [chat.netcore.io](http://chat.netcore.io/) Server Events back-end where it's able to communicate with existing Ajax clients and other connected Android Chat Apps.

[![](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/livedemos/xamarin-android-server-events.png)](https://www.youtube.com/watch?v=tImAm2LURu0)

> [YouTube Video](https://www.youtube.com/watch?v=tImAm2LURu0) and [AndroidXamarinChat Repo](https://github.com/ServiceStackApps/AndroidXamarinChat)

---
# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/grpc/csharp.md
---
slug: grpc-csharp
title: gRPC protoc C# Client
---

[![](/img/pages/grpc/csharp.png)](https://youtu.be/0TXk9y24NIw)

::: info YouTube
YouTube: [youtu.be/0TXk9y24NIw](https://youtu.be/0TXk9y24NIw)
:::

## C# protoc generated GrpcServiceClient TodoWorld Example

Install [x dotnet tool](/dotnet-tool):

:::sh
dotnet tool install --global x
:::

Create a new C# Console App:

:::sh
dotnet new console
:::

Add required NuGet Packages:

:::sh
dotnet add package Google.Protobuf && dotnet add package ServiceStack.GrpcClient
:::

Download the TodoWorld SSL Certificate used for its gRPC HTTP/2 Services:

:::sh
x get https://todoworld.servicestack.net/grpc.crt
:::

Add protoc generated TodoWorld DTOs and gRPC GrpcServiceClient:

:::sh
x proto-csharp https://todoworld.servicestack.net
:::

Use the protoc generated DTOs and `GrpcServiceClient` to call the TodoWorld gRPC Service:

### C# smart gRPC GrpcServicesClient Example

The smart client registers a ServiceStack Interceptor to enable the richer integration features in ServiceStack Services as found in ServiceStack's Generic `GrpcServiceClient` above, including detailed structured Exception handling, built-in JWT, Session, Credentials Auth support, [Automatically refreshing Access Tokens](/auth/jwt-authprovider#automatically-refreshing-access-tokens), etc.
This ServiceStack Interceptor can be registered using `GrpcServiceStack.Client()` when creating the protoc `GrpcServicesClient`: ```csharp using System; using System.Security.Cryptography.X509Certificates; using System.Threading.Tasks; using ServiceStack; namespace TodoWorld { class Program { static async Task Main(string[] args) { var client = new GrpcServices.GrpcServicesClient( GrpcServiceStack.Client("https://todoworld.servicestack.net:50051", new X509Certificate2("grpc.crt"), GrpcUtils.AllowSelfSignedCertificatesFrom("todoworld.servicestack.net"))); var response = await client.GetHelloAsync(new Hello { Name = "gRPC C#" }); Console.WriteLine(response.Result); } } } ``` Override `Program.cs` with the above C# Example: :::sh npx add-in todoworld-csharp-smart ::: Run example: :::sh dotnet run ::: ### protoc-only generated Service Client Alternatively clients can opt to use the vanilla protoc generated ServiceClient without any dependency to **ServiceStack.GrpcClient** which will fallback to gRPC's default behavior of basic error handling with any auth headers needing to be populated manually. 
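As a sketch of what populating an auth header manually looks like with the vanilla protoc client (assuming an existing `GrpcChannel` in `channel` and a valid JWT in `jwt`):

```csharp
// Sketch: passing a JWT Bearer Token per-call with the plain protoc generated
// client (no ServiceStack.GrpcClient dependency). `channel` and `jwt` are
// assumed to exist; protoc generated client methods accept optional headers.
var client = new GrpcServices.GrpcServicesClient(channel);

var headers = new Grpc.Core.Metadata {
    { "Authorization", $"Bearer {jwt}" }
};

var response = await client.GetHelloAsync(
    new Hello { Name = "gRPC C#" }, headers);
```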
Add required core package dependencies:

:::sh
dotnet add package Grpc.Core && dotnet add package Grpc.Net.Client
:::

### C# protoc gRPC insecure Example

```csharp
using System;
using System.Threading.Tasks;
using Grpc.Core;
using Grpc.Net.Client;

namespace TodoWorld
{
    class Program
    {
        static async Task Main(string[] args)
        {
            AppContext.SetSwitch(
                "System.Net.Http.SocketsHttpHandler.Http2UnencryptedSupport", true);

            var client = new GrpcServices.GrpcServicesClient(
                GrpcChannel.ForAddress("http://todoworld.servicestack.net:5054"));

            var response = await client.GetHelloAsync(new Hello { Name = "gRPC C#" });

            Console.WriteLine(response.Result);
        }
    }
}
```

Override `Program.cs` with the above C# Example:

:::sh
npx add-in todoworld-csharp
:::

Run example:

:::sh
dotnet run
:::

### C# protoc gRPC SSL Example

```csharp
using System;
using System.Linq;
using System.Net.Security;
using System.Security.Cryptography.X509Certificates;
using System.Threading.Tasks;
using Grpc.Core;
using Grpc.Net.Client;

namespace TodoWorld
{
    class Program
    {
        static async Task Main(string[] args)
        {
            var client = new GrpcServices.GrpcServicesClient(
                GrpcChannel.ForAddress("https://todoworld.servicestack.net:50051",
                    new GrpcChannelOptions {
                        HttpClient = new System.Net.Http.HttpClient(
                            new System.Net.Http.HttpClientHandler {
                                ClientCertificates = { new X509Certificate2("grpc.crt") },
                                ServerCertificateCustomValidationCallback = (req, cert, certChain, sslErrors) =>
                                    cert.SubjectName.RawData.SequenceEqual(cert.IssuerName.RawData) && // self-signed
                                    cert.GetNameInfo(X509NameType.DnsName, forIssuer:false) == "todoworld.servicestack.net" &&
                                    (sslErrors & ~SslPolicyErrors.RemoteCertificateChainErrors) == SslPolicyErrors.None // only this
                            })
                    }));

            var response = await client.GetHelloAsync(new Hello { Name = "gRPC C#" });

            Console.WriteLine(response.Result);
        }
    }
}
```

Override `Program.cs` with the above C# Example:

:::sh
npx add-in todoworld-csharp-ssl
:::

Run example:

:::sh
dotnet run
:::

### C# Local Development gRPC SSL CRUD Example

```csharp
using
System; using System.Threading.Tasks; using Grpc.Net.Client; using TodoWorld; namespace CSharp { public class Program { static async Task Main(string[] args) { // Certificate registration not required when using trusted local development certificate var client = new GrpcServices.GrpcServicesClient(GrpcChannel.ForAddress("https://localhost:5001")); await client.PostResetTodosAsync(new ResetTodos()); //POST /todos var todo = (await client.PostCreateTodoAsync(new CreateTodo { Title = "ServiceStack" })).Result; Console.WriteLine($"new todo Id: {todo.Id}, Title: {todo.Title}"); //GET /todos var all = await client.CallGetTodosAsync(new GetTodos()); Console.WriteLine($"todos: {all.Results?.Count ?? 0}"); //GET /todos/1 todo = (await client.CallGetTodoAsync(new GetTodo { Id = todo.Id })).Result; Console.WriteLine($"get todo Id: {todo.Id}, Title: {todo.Title}"); //GET /todos all = await client.CallGetTodosAsync(new GetTodos()); Console.WriteLine($"todos: {all.Results?.Count ?? 0}"); //PUT /todos/1 await client.PutUpdateTodoAsync(new UpdateTodo { Id = todo.Id, Title = "gRPC" }); //GET /todos/1 todo = (await client.CallGetTodoAsync(new GetTodo { Id = todo.Id })).Result; Console.WriteLine($"updated todo Title: {todo.Title}"); //DELETE /todos/1 await client.CallDeleteTodoAsync(new DeleteTodo { Id = todo.Id }); //GET /todos all = await client.CallGetTodosAsync(new GetTodos()); Console.WriteLine($"todos: {all.Results?.Count ?? 0}"); } } } ``` Refer to [/clients/csharp](https://github.com/NetCoreApps/todo-world/tree/master/clients/csharp) for a complete example project. 
### More Examples

For more protoc generated `GrpcServices.GrpcServiceClient` examples check out the unit tests at:

- [ProtocTests.cs](https://github.com/ServiceStack/ServiceStack/blob/master/tests/ServiceStack.Extensions.Tests/Protoc/ProtocTests.cs)
- [ProtocTodoTests.cs](https://github.com/ServiceStack/ServiceStack/blob/master/tests/ServiceStack.Extensions.Tests/Protoc/ProtocTodoTests.cs)
- [ProtocAuthTests.cs](https://github.com/ServiceStack/ServiceStack/blob/master/tests/ServiceStack.Extensions.Tests/Protoc/ProtocAuthTests.cs)
- [ProtocServerEventsTests.cs](https://github.com/ServiceStack/ServiceStack/blob/master/tests/ServiceStack.Extensions.Tests/Protoc/ProtocServerEventsTests.cs)
- [ProtocDynamicAutoQueryTests.cs](https://github.com/ServiceStack/ServiceStack/blob/master/tests/ServiceStack.Extensions.Tests/Protoc/ProtocDynamicAutoQueryTests.cs)

---
# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/csv-format.md
---
title: CSV Format
---

The [CSV format](http://en.wikipedia.org/wiki/Comma-separated_values) is a first-class supported format which means all your existing web services can automatically accept and return CSV without any config or code changes.

### Importance of CSV

CSV is an important format for transferring, migrating and quickly visualizing data, as all spreadsheets support viewing and editing CSV files directly, whilst most RDBMS's support exporting and importing CSV data. Compared with other serialization formats, it provides a compact and efficient way to transfer large datasets in an easy to read text format.

### Speed

The CSV Serializer was developed using the same tech that makes [ServiceStack's JSV and JSON serializers fast](http://www.servicestack.net/benchmarks/NorthwindDatabaseRowsSerialization.100000-times.2010-08-17.html) (i.e. no run-time reflection, static delegate caching, etc), which should make it the fastest CSV serializer available for .NET.
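As a quick illustration of the serializer's API, a sketch of a CSV round-trip using `ServiceStack.Text`'s `ToCsv()`/`FromCsv()` extension methods with a hypothetical `Customer` POCO:

```csharp
// Sketch: CSV round-trip with ServiceStack.Text, using a hypothetical Customer POCO
using System.Collections.Generic;
using ServiceStack.Text;

var customers = new List<Customer> {
    new Customer { Id = 1, Name = "Alice" },
    new Customer { Id = 2, Name = "Bob" },
};

// Serializes a header row followed by one row per item,
// roughly: Id,Name / 1,Alice / 2,Bob
var csv = customers.ToCsv();

// Deserialize back into typed POCOs
var roundTrip = csv.FromCsv<List<Customer>>();

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}
```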
### Downloadable Separately

The `CsvSerializer` is maintained in the [ServiceStack.Text](https://github.com/ServiceStack/ServiceStack/tree/main/ServiceStack.Text) project which can be downloaded from NuGet at:

:::copy
`<PackageReference Include="ServiceStack.Text" Version="10.*" />`
:::

### How to register your own custom format with ServiceStack

Registering a custom format is done by registering the Format's Content-Type with your AppHost's `ContentTypes` API, e.g:

```csharp
//Register the 'text/csv' content-type format
//Note: Format is inferred from the last part of the content-type, e.g. 'csv'
public class CsvFormat : IPlugin
{
    public void Register(IAppHost appHost)
    {
        appHost.ContentTypes.Register(MimeTypes.Csv,
            SerializeToStream,
            CsvSerializer.DeserializeFromStream);

        //ResponseFilter to add 'Content-Disposition' header for browsers to open in Spreadsheet
        appHost.GlobalResponseFilters.Add((req, res, dto) => {
            if (req.ResponseContentType == MimeTypes.Csv)
            {
                var fileName = req.OperationName + ".csv";
                res.AddHeader(HttpHeaders.ContentDisposition,
                    $"attachment;{HttpExt.GetDispositionFileName(fileName)}");
            }
        });
    }

    void SerializeToStream(IRequest req, object request, Stream stream) =>
        CsvSerializer.SerializeToStream(request, stream);
}
```

We recommend encapsulating Custom Format registrations into a [Plugin](/plugins) as done with the built-in [CsvFormat](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack/Formats/CsvFormat.cs) which is added by default:

```csharp
Plugins.Add(new CsvFormat()); //added by default
```

Which makes it easy to register, detect and remove. E.g.
to remove built-in support for CSV you can just remove it from the `Plugins` collection:

```csharp
Plugins.RemoveAll(x => x is CsvFormat);
```

The ability to register another format that provides immediate value and added functionality to all your existing web services (without any code-changes or configuration) is, we believe, a testament to ServiceStack's clean design of using strongly-typed 'message-based' DTOs to let you develop clean, testable and re-usable web services. No code-gen or marshalling is required to bind to an abstract method signature; every request and calling convention maps naturally to your Web Services DTOs.

## Usage

The CSV format is effectively a first-class supported format so everything should work as expected, including being registered as an available format on ServiceStack's metadata index page:

* [/metadata](https://northwind.netcore.io/metadata)

And being able to preview the output of a service:

* [/csv/metadata?op=CustomerDetails](https://northwind.netcore.io/csv/metadata?op=CustomerDetails)

By default they are automatically available using ServiceStack's standard calling conventions, e.g:

* [/csv/reply/Customers](https://northwind.netcore.io/csv/reply/Customers)

### REST Usage

CSV also works just as you would expect with user-defined REST-ful urls, i.e.
you can append `?format=csv` to specify the format in the url e.g:

* [/customers?format=csv](https://northwind.netcore.io/customers?format=csv)

This is how the above web service output looks when opened up in [Google Docs](https://spreadsheets.google.com/pub?key=0AjnFdBrbn8_fdDBwX0Rha04wSTNWZDZlQXctcmp2bVE&hl=en_GB&output=html).

Alternatively, following the HTTP specification, you can also specify the `"text/csv"` content-type in the *Accept* header of your HttpClient, as done in the [HTTP Utils](/http-utils) extension methods:

```csharp
var csv = "https://northwind.netcore.io/customers".GetCsvFromUrl();
```

## CSV Deserialization Support

The introduction of the [AutoQuery Data](/autoquery/data) feature and its `MemorySource` made full CSV support a lot more appealing, which prompted CSV Deserialization support, whose implementation is now complete. This unlocks the ability to create fully-queryable Services over flat-file .csv's (or Excel spreadsheets exported to .csv) by just deserializing CSV into a List of POCO's and registering it with AutoQuery Data:

```csharp
var pocos = File.ReadAllText("path/to/data.csv").FromCsv<List<Poco>>();

//AutoQuery Data Plugin
Plugins.Add(new AutoQueryDataFeature()
    .AddDataSource(ctx => ctx.MemorySource(pocos)));

// AutoQuery DTO
[Route("/pocos")]
public class QueryPocos : QueryData<Poco> {}
```

### Super CSV Format

A noteworthy feature that sets ServiceStack's CSV support apart is that it's built on the compact and very fast [JSV format](/jsv-format), which not only can deserialize a tabular flat file of scalar values at high-speed, it also supports deeply nested object graphs, which are encoded in JSV and escaped in a CSV field as normal.
An example of this can be seen in the HTTP sample log fragment below where the HTTP Request Headers are serialized from a `Dictionary<string,string>`:

```csv
Id,HttpMethod,AbsoluteUri,Headers
1,GET,http://localhost:55799,"{Connection:keep-alive,Accept:""text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8"",Accept-Encoding:""gzip, deflate, sdch"",Accept-Language:""en-US,en;q=0.8"",Host:""localhost:55799"",User-Agent:""Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/49.0.2623.112 Safari/537.36"",Upgrade-Insecure-Requests:1}"
```

Being such a versatile file format opens up a lot of new possibilities, e.g. instead of capturing seed data in code you could maintain it in plain-text .csv files and effortlessly load it on App Startup, e.g:

```csharp
using (var db = container.Resolve<IDbConnectionFactory>().Open())
{
    if (db.CreateTableIfNotExists<Country>()) //returns true if Table created
    {
        List<Country> countries = "~/App_Data/countries.csv".MapHostAbsolutePath()
            .ReadAllText().FromCsv<List<Country>>();
        db.InsertAll(countries);
    }
}
```

### All Services now accept CSV Content-Types

Another immediate benefit of CSV Deserialization is that all Services can now process the CSV Content-Type.
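Since CSV is just another Content-Type, any HTTP client can post CSV to your Services by setting the `text/csv` Content-Type. A minimal BCL-only sketch (the `/pocos` endpoint URL is a placeholder for your own Service):

```csharp
using System.Net.Http;
using System.Text;

// CSV request body: header row followed by data rows
var csvBody = "Id,Name\r\n1,Foo\r\n2,Bar\r\n";

// StringContent sets the 'Content-Type: text/csv' header for us
var content = new StringContent(csvBody, Encoding.UTF8, "text/csv");

// Placeholder endpoint; a real Service would deserialize this into its Request DTO
var req = new HttpRequestMessage(HttpMethod.Post, "http://example.org/pocos") {
    Content = content
};

// Sending it would then be: var res = await new HttpClient().SendAsync(req);
```

The same request could of course also be sent with the HTTP Utils or `CsvServiceClient` APIs covered below, which take care of the Content-Type for you.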
Being a tabular data format, CSV shines when processing a list of DTO's. One way to do that in ServiceStack is to have your Request DTO inherit `List<T>`:

```csharp
[Route("/pocos")]
public class Pocos : List<Poco>, IReturn<Pocos>
{
    public Pocos() {}
    public Pocos(IEnumerable<Poco> collection) : base(collection) {}
}
```

It also behaves the same way as CSV Serialization but in reverse, where if your Request DTO is annotated with either `[DataContract]` or the more explicit `[Csv(CsvBehavior.FirstEnumerable)]` it will automatically deserialize the CSV into the first `IEnumerable` property, so these 2 Request DTO's are equivalent to the above:

```csharp
[Route("/pocos")]
[DataContract]
public class Pocos : IReturn<Pocos>
{
    [DataMember]
    public List<Poco> Items { get; set; }
}

[Route("/pocos")]
[Csv(CsvBehavior.FirstEnumerable)]
public class Pocos : IReturn<Pocos>
{
    public List<Poco> Items { get; set; }
}
```

In addition to the above flexible options for defining CSV-friendly Services, there are also a few different options for sending CSV Requests to the above Services.
You can use the CSV `PostCsvToUrl()` extension methods added to [HTTP Utils](/http-utils):

```csharp
string csvText = File.ReadAllText("pocos.csv");

//Send CSV Text
List<Poco> response = "http://example.org/pocos"
    .PostCsvToUrl(csvText)
    .FromCsv<List<Poco>>();

//Send POCO DTO's
List<Poco> dtos = csvText.FromCsv<List<Poco>>();
response = "http://example.org/pocos"
    .PostCsvToUrl(dtos)
    .FromCsv<List<Poco>>();
```

Alternatively you can use the `CsvServiceClient` which has the nice Typed APIs you'd expect from a Service Client:

```csharp
var client = new CsvServiceClient(baseUrl);

Pocos response = client.Post(new Pocos(dtos));
```

### Ideal for Auto Batched Requests

The `CsvServiceClient`, by virtue of being configured to use a well-defined tabular data format, is perfect for sending [Auto-Batched Requests](/auto-batched-requests), which by definition send a batch of POCO's, making CSV the most compact text format to send them with:

```csharp
var requests = new[]
{
    new Request { ... },
    new Request { ... },
    new Request { ... },
};

var responses = client.SendAll(requests);
```

## Limitations

As most readers familiar with the CSV format will know, there are some inherent limitations with CSV, namely that it's a flat-structured tabular data format which really only supports serialization of a single resultset. This limitation remains, although if you decorate your Response DTO with a `[Csv(CsvBehavior.FirstEnumerable)]` or standard .NET `[DataContract]`/`[DataMember]` attributes the CSV Serializer will use the following conventions:

* If you only return one result in your DTO it will serialize that.
* If you return multiple results it will pick the first `IEnumerable<>` property, or if none exists, the first property.
* Non-enumerable results are treated like a single row.

Basically if you only return 1 result it should work as expected, otherwise it will choose the best candidate based on the rules above.
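A sketch of the conventions above, using a hypothetical Response DTO (`CustomersResponse` and `Customer` are example types, not from the docs) where the serializer is told to pick the first `IEnumerable<>` property as the resultset:

```csharp
using System.Collections.Generic;
using ServiceStack;
using ServiceStack.Text;

var response = new CustomersResponse {
    Total = 2,
    Results = new List<Customer> {
        new() { Id = 1, Name = "Alice" },
        new() { Id = 2, Name = "Bob" },
    }
};

// Only the Results rows should appear in the CSV output,
// per the FirstEnumerable convention described above
var csv = response.ToCsv();

// A hypothetical example POCO
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// Hypothetical Response DTO: the [Csv] attribute tells the serializer
// to serialize the first IEnumerable property (Results) as the resultset
[Csv(CsvBehavior.FirstEnumerable)]
public class CustomersResponse
{
    public int Total { get; set; }
    public List<Customer> Results { get; set; }
}
```

This is the behavior that lets a typical `QueryResponse`-style DTO (count + results) still render as a clean spreadsheet of rows.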
Historically, the second major limitation was that it didn't include a CSV Deserializer, so while you could view results in CSV format you couldn't post data to your web service in CSV and have it automatically deserialized for you. This has since been addressed by the CSV Deserialization support described above.

# Features

Unlike most CSV serializers that can only serialize rows of primitive values, the CsvSerializer uses the [JSV format](/jsv-format) under the hood, so even [complex types](https://spreadsheets.google.com/pub?key=0AjnFdBrbn8_fdG83eWdGM1lnVW9PMlplcmVDYWtXeVE&hl=en_GB&output=html) will be serialized in fields in an easy-to-read format - no matter how deep the hierarchy.

---

# Source: https://raw.githubusercontent.com/ServiceStack/docs.servicestack.net/refs/heads/main/MyApp/_pages/vue/custom-autoforms.md

---
title: Custom Auto Forms
group: Component Gallery
---

## Custom AutoForm UIs

[CoffeeShop's Admin UI](https://servicestack.net/posts/building-typechat-coffeeshop-modelling) is a good example of the rapid development model of AutoQuery, where Vue's [AutoQueryGrid](/vue/autoquerygrid) and [Auto Form](/vue/autoform) Components were nearly able to generate the entire CRUD management UI using just AutoQuery's Typed DTOs.

The one Form it wasn't able to generate the entire UI for is its **Many-to-Many** `CategoryOption` relationship, which requires a custom AutoForm component to be able to specify which Options a category of CoffeeShop Products can have.
<div class="not-prose">
  <div class="mb-16 flex justify-center">
    <iframe style="width:896px;height:504px;" src="https://www.youtube.com/embed/MjNqPAXLH5w?si=HDFs2FnYhtuZSDWL&start=404" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
  </div>
</div>

### Implementing Many to Many CategoryOption Admin UI

The easier way to implement this functionality would be to have the UI call an API each time an `Option` is added or removed to a `Category`. The problem with this approach is that it doesn't match the existing behavior where, if a User **cancels** a form, they'd expect none of their changes to be applied.

To implement the desired functionality we'll instead create a custom `UpdateCategory` implementation that also handles any changes to `CategoryOption`, using new `AddOptionIds` and `RemoveOptionIds` properties that we'll want rendered as **hidden** inputs in our HTML Form:

```csharp
public class UpdateCategory : IPatchDb<Category>, IReturn<Category>
{
    public int Id { get; set; }
    public string? Name { get; set; }
    public string? Description { get; set; }

    [Input(Type = "tag"), FieldCss(Field = "col-span-12")]
    public List<string>? Sizes { get; set; }

    [Input(Type = "tag"), FieldCss(Field = "col-span-12")]
    public List<string>? Temperatures { get; set; }

    public string? DefaultSize { get; set; }
    public string? DefaultTemperature { get; set; }

    [Input(Type = "file"), UploadTo("products")]
    public string? ImageUrl { get; set; }

    [Input(Type = "hidden")]
    public List<int>? AddOptionIds { get; set; }

    [Input(Type = "hidden")]
    public List<int>?
RemoveOptionIds { get; set; }
}
```

## Custom AutoQuery Implementation

The [Custom AutoQuery Implementation](/autoquery/rdbms#custom-autoquery-implementations) in [CoffeeShopServices.cs](https://github.com/NetCoreApps/TypeChatExamples/blob/main/TypeChatExamples.ServiceInterface/CoffeeShopServices.cs) contains the custom implementation, which continues to utilize AutoQuery's **Partial Update** functionality if there are any changes to update, as well as removing or adding any Options the user makes to the `Category`:

```csharp
public class CoffeeShopServices(IAutoQueryDb autoQuery) : Service
{
    public async Task<object> Any(UpdateCategory request)
    {
        // Perform all RDBMS Updates within the same Transaction
        using var trans = Db.OpenTransaction();

        Category? response = null;
        var ignore = new[]{ nameof(request.Id), nameof(request.AddOptionIds), nameof(request.RemoveOptionIds) };
        // Only call AutoQuery Update if there's something to update
        if (request.ToObjectDictionary().HasNonDefaultValues(ignoreKeys:ignore))
        {
            response = (Category) await autoQuery.PartialUpdateAsync<Category>(request, Request, Db);
        }
        if (request.RemoveOptionIds?.Count > 0)
        {
            await Db.DeleteAsync<CategoryOption>(x =>
                x.CategoryId == request.Id && request.RemoveOptionIds.Contains(x.OptionId));
        }
        if (request.AddOptionIds?.Count > 0)
        {
            await Db.InsertAllAsync(request.AddOptionIds.Map(id =>
                new CategoryOption { CategoryId = request.Id, OptionId = id }));
        }
        trans.Commit();

        response ??= request.ConvertTo<Category>();
        return response;
    }
}
```

## Custom AutoForm Component

We now need to implement a custom UI that Adds/Removes Options from a Category, which is done in a custom `CategoryOptions` Vue Component that displays all the Category Options with a button to remove existing ones and a Select Input to add non-existing options.
The purpose of the component is to populate the `addOptionIds` field with Option Ids that should be added and `removeOptionIds` with Ids to be removed, which updates the Request DTO of the parent Form Model with the `update:modelValue` event:

```js
const CategoryOptions = {
    template:`
    <div>
        <ul v-for="optionType in currentOptionTypes">
            <li class="py-1 flex justify-between">
                <span>
                    {{optionType}}
                </span>
                <span>
                    <svg class="w-6 h-6 text-red-600 hover:text-red-800 cursor-pointer" @click="removeOption(optionType)" xmlns='http://www.w3.org/2000/svg' width='1024' height='1024' viewBox='0 0 1024 1024'>
                        <title>Remove Option
Add Option
    `,
    props:['type','id','modelValue'],
    emits:['update:modelValue'],
    setup(props, { emit }) {
        const client = useClient()
        const options = ref([])
        const model = props.modelValue
        model.addOptionIds ??= []
        model.removeOptionIds ??= []
        const origOptionIds = model.categoryOptions?.map(x => x.optionId) || []
        const currentOptionIds = computed(() => [...origOptionIds, ...model.addOptionIds]
            .filter(x => !model.removeOptionIds.includes(x)))
        const currentOptionTypes = computed(() =>
            currentOptionIds.value.map(id => options.value.find(x => x.id === id)?.type).filter(x => !!x))

        function addOption(e) {
            const optionType = e.target.value
            if (!optionType) return
            const option = options.value.find(x => x.type === optionType)
            if (model.removeOptionIds.includes(option.id))
                model.removeOptionIds = model.removeOptionIds.filter(id => id !== option.id)
            else if (!model.addOptionIds.includes(option.id))
                model.addOptionIds.push(option.id)
            emit('update:modelValue', model)
        }

        function removeOption(optionType) {
            const option = options.value.find(x => x.type === optionType)
            if (model.addOptionIds.includes(option.id))
                model.addOptionIds = model.addOptionIds.filter(id => id !== option.id)
            else if (!model.removeOptionIds.includes(option.id))
                model.removeOptionIds.push(option.id)
        }

        onMounted(async () => {
            const api = await client.api(new QueryOptions({ orderBy:'id' }))
            options.value = api.response.results || []
            emit('update:modelValue', model)
        })

        return { options, addOption, removeOption, currentOptionTypes }
    }
}
```

Which is then attached to the AutoQueryGrid Form Components using its `