
Commit 9507555

MaxiDonkey@hotmail.com authored and committed

up to date version 1.2.0
1 parent 4ee8b3c commit 9507555

22 files changed

Lines changed: 3546 additions & 896 deletions

Changelog.md

Lines changed: 39 additions & 0 deletions
@@ -1,3 +1,42 @@
#### 2025, August 12 version 1.2.0

**Modifications to ensure full use of the gpt-5 model**

- JSON Normalization Before Deserialization
  - New `GenAI.API.Normalizer` module (`TJSONNormalizer`, `TWrapKind`, `TNormalizationRule`) to unify polymorphic fields (e.g., string vs. object).
  - Direct integration in the HTTP layer: new `Get(..., Path)` overloads allow targeted normalization of a JSON subtree before object mapping.

- Canceling Background Requests
  - New `Responses.AsyncAwaitCancel(response_id)` method to cancel an asynchronous (`background = true`) response, with full callback support (`OnStart`, `OnSuccess`, `OnError`).

- Streaming Enhancements
  - Extended typed coverage for streaming events and outputs (MCP, Code Interpreter, Image Generation, etc.) via new `Responses.OutputParams` classes (`TResponseOutput*`, `TResponseImageGenerationTool`, `TResponseCodeInterpreter`, etc.).

- New Types and Parameters
  - InputParams: full coverage for computer interactions, local shell, MCP, web search, code, image generation, reasoning, text/JSON formats, tool choice/hosted tool, and file search filters.
  - OutputParams: states (`Created`, `InProgress`, etc.), events (`Added`, `Delta`), usage metrics, and statistics.
  - New enums (`TOutputIncluding`, `TReasoningGenerateSummary`, `TFidelityType`, etc.).

- API `v1/chat/completions`
  - New parameters:
    - `prompt_cache_key` (prompt caching)
    - `safety_identifier` (stable ID for safety monitoring)
    - `verbosity` (low/medium/high)
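The three new chat-completions parameters fit the wrapper's usual parameter-builder style. A minimal synchronous sketch, with the caveat that the `PromptCacheKey`, `SafetyIdentifier`, and `Verbosity` setter names are assumptions inferred from the JSON field names:

```pascal
//uses GenAI, GenAI.Types;

// Hedged sketch: setter names assumed from the JSON field names above.
var Chat := Client.Chat.Create(
  procedure (Params: TChatParams)
  begin
    Params.Model('gpt-5');
    Params.Messages([TMessagePayload.User('Summarize this changelog.')]);
    Params.PromptCacheKey('demo-cache');       // prompt caching
    Params.SafetyIdentifier('end-user-1234');  // stable safety-monitoring ID
    Params.Verbosity('low');                   // low / medium / high
  end);
try
  Display(TutorialHub, Chat);
finally
  Chat.Free;
end;
```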

- API `v1/responses`
  - New parameters:
    - `max_tool_calls`
    - `prompt` (template reference via `TPromptParams`)
    - `prompt_cache_key`, `safety_identifier`
    - `stream_options`, `top_logprobs`, `verbosity`
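These land in the `v1/responses` request builder in the same way. A hedged sketch only: the `TResponsesParams` type name and the setter names are assumptions based on the field names listed above.

```pascal
//uses GenAI, GenAI.Types;

// Hedged sketch: TResponsesParams and setter names are assumed.
var Response := Client.Responses.Create(
  procedure (Params: TResponsesParams)
  begin
    Params.Model('gpt-5');
    Params.Input('Outline the migration steps.');
    Params.MaxToolCalls(4);                 // cap built-in tool invocations
    Params.TopLogprobs(3);                  // top-3 token log probabilities
    Params.PromptCacheKey('migration-doc'); // reuse cached prompt prefixes
  end);
try
  Display(TutorialHub, Response);
finally
  Response.Free;
end;
```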

- Structured System and Developer Messages
  - New overloads:
    - `TMessagePayload.Developer(const Content: TArray; const Name: string = '')`
    - `TMessagePayload.System(const Content: TArray; const Name: string = '')`
  - Improves parity between plain text and structured content flows.
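A hedged sketch of the structured overloads in use. Note the diff rendering dropped the generic argument of `TArray`; `string` is assumed here, and the surrounding `TChatParams` plumbing is likewise an assumption:

```pascal
//uses GenAI, GenAI.Types;

// Hedged sketch: TArray element type assumed to be string.
var Chat := Client.Chat.Create(
  procedure (Params: TChatParams)
  begin
    Params.Model('gpt-5');
    Params.Messages([
      // Structured system message: several content parts plus an optional name
      TMessagePayload.System(['You are a terse assistant.', 'Always cite sources.'], 'policy'),
      TMessagePayload.User('Summarize the attached notes.')
    ]);
  end);
try
  Display(TutorialHub, Chat);
finally
  Chat.Free;
end;
```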

<br>

### 2025, June 14 version 1.1.0 (**Getit version**)
- Given the project’s rapid progress, it’s now essential to embed versioning directly into the GenAI wrapper’s source code. For any client implementing the IGenAI interface, the version number can be retrieved via the Version property, for example:
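The hunk is truncated before the code sample itself; a minimal sketch of the idea, where the `TGenAIFactory.CreateInstance` constructor and the `MY_API_KEY` constant are assumptions:

```pascal
//uses GenAI, Vcl.Dialogs;

// Hedged sketch: factory name and API-key constant are assumed.
var Client: IGenAI := TGenAIFactory.CreateInstance(MY_API_KEY);
ShowMessage('GenAI wrapper version: ' + Client.Version);
```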

ChatCompletion.md

Lines changed: 1 addition & 1 deletion
@@ -112,7 +112,7 @@ The Chat API can be used for both single-turn requests and multi-turn, stateless
// procedure (E: Exception)
// begin
// Display(TutorialHub, E.Message);
-// end
+// end);
```
<br>

README.md

Lines changed: 3 additions & 3 deletions
@@ -5,13 +5,13 @@ ___
![GitHub](https://img.shields.io/badge/IDE%20Version-Delphi%2010.4/11/12-ffffba)
[![GetIt – Available](https://img.shields.io/badge/GetIt-Available-baffc9?logo=delphi&logoColor=white)](https://getitnow.embarcadero.com/genai-optimized-openai-integration-wrapper/)
![GitHub](https://img.shields.io/badge/platform-all%20platforms-baffc9)
-![GitHub](https://img.shields.io/badge/Updated%20on%20June%2013,%202025-blue)
+![GitHub](https://img.shields.io/badge/Updated%20on%20August%2012,%202025-blue)

<br>

NEW:
- Getit current version: 1.1.0
-- [Changelog v1.1.0](https://github.com/MaxiDonkey/DelphiGenAI/blob/main/Changelog.md)
+- [Changelog v1.2.0](https://github.com/MaxiDonkey/DelphiGenAI/blob/main/Changelog.md)
- [Mini-lab to experiment with the v1/responses endpoint `File2knowledge`](https://github.com/MaxiDonkey/file2knowledge)
- [Responses endpoint](https://github.com/MaxiDonkey/DelphiGenAI/blob/main/Responses.md)
___
@@ -75,7 +75,7 @@ ___
**GenAI** is a powerful Delphi library that brings the latest OpenAI APIs to your desktop, mobile, and server apps.

**Core capabilities**
-- Unified access to text, vision, speech, and audio endpoints
+- Unified access to text, vision, speech and audio endpoints
- Agentic workflows via the `v1/responses` endpoint, with built-in tools `file_search`, `web_search`, `Code Interpreter`, and `remote MCP`
- Supports state-of-the-art models, including ***gpt-4o***, ***gpt-4.1***, ***gpt-4.5*** and the reasoning-centric *o1 · o3 · o4* series

Responses.md

Lines changed: 57 additions & 0 deletions
@@ -13,6 +13,7 @@
- [Get a model response](#get-a-model-response)
- [Delete a model response](#delete-a-model-response)
- [List input items](#list-input-items)
+- [Canceling Background Tasks](#canceling-background-tasks)
- [Vision](#vision)
- [Analyze single source](#analyze-single-source)
- [Analyze multi-source](#analyze-multi-source)
@@ -621,6 +622,62 @@ Returns a list of input items for a given response. Refer to the [official docum

___

### Canceling Background Tasks

Terminates an ongoing model response identified by its ID. This action is only available for responses that were initiated with the `background` parameter set to `true`.

Some reasoning models—such as Codex and Deep Research—can take several minutes to solve complex problems. Background mode allows you to run long-running operations on models like o3 and o1-pro more reliably, avoiding timeouts and network interruptions.

When background mode is enabled, the task starts asynchronously. You can then periodically poll the response object to monitor its progress. To start a background process, send an API request with `background` set to `true`.

And, of course, you can cancel a background task, as shown below.

<br>

```pascal
//uses GenAI, GenAI.Types, GenAI.Tutorial.VCL;

TutorialHub.JSONRequestClear;

//Asynchronous promise example
var Promise := Client.Responses.AsyncAwaitCancel('Response_ID');

Promise
  .&Then<TResponse>(
    function (Value: TResponse): TResponse
    begin
      Result := Value;
      Display(TutorialHub, Value);
    end)
  .&Catch(
    procedure (E: Exception)
    begin
      Display(TutorialHub, E.Message);
    end);

//Asynchronous example
// Client.Responses.AsynCancel('Response_ID',
//   function : TAsynResponse
//   begin
//     Result.Sender := TutorialHub;
//     Result.OnStart := Start;
//     Result.OnSuccess := Display;
//     Result.OnError := Display;
//   end);

//Synchronous example
// var Value := Client.Responses.Cancel('Response_ID');
// try
//   Display(TutorialHub, Value);
// finally
//   Value.Free;
// end;
```

<br>

___
## Vision

Refer to the [official documentation](https://platform.openai.com/docs/guides/vision).
