
Building an AI using Pascal.

Posted: Wed Apr 01, 2026 9:31 pm
by ONiX
Yes it is possible!

According to both Gemini and GPT 5.1 Codex, you can develop an AI using pure Pascal...

High-Level Overview
To create a functional AI (like a Multi-Layer Perceptron) in Pure Pascal, you need to build four core components:

Data Structures: Matrix and Vector representations using arrays and pointers.
Linear Algebra Engine: Procedures for matrix multiplication and addition.
Activation Functions: Mathematical functions like Sigmoid or Tanh.
The Learning Algorithm: Implementation of Forward Propagation and Backpropagation (Gradient Descent).

1. Handling Memory Constraints
Turbo Pascal's biggest hurdle is the 64KB segment limit for static arrays. A large neural network requires many "Weights" (floating-point numbers).

The Issue: A Real type in Turbo Pascal takes 6 bytes, so a 100x100 weight matrix already occupies 60,000 bytes, brushing up against the 64KB limit for a single data structure.
The Solution: Use Pointers and the Heap. By allocating memory dynamically, you can utilize the full 640KB (or even XMS/EMS if you wrote a driver, though we'll stick to the Heap).

Code: Select all

type
  PWeightArray = ^TWeightArray;
  TWeightArray = array[1..1000] of Real; { allocated on the heap with GetMem or New }
2. Defining the "Neuron" Structure
In pure Pascal, you should define a Layer object or record that holds weights, biases, and the resulting activations.

Code: Select all

type
  PLayer = ^TLayer;
  TLayer = record
    Units: Integer;
    Weights: PWeightArray; { Flattened 2D array }
    Biases: PWeightArray;
    Outputs: PWeightArray;
    Gradients: PWeightArray;
  end;
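Putting the pointer type and the record together, one procedure can allocate all four arrays for a layer on the heap. This is a minimal sketch; InitLayer and its parameter names are illustrative, not part of any standard unit.

Code: Select all

```pascal
{ Sketch: heap allocation for one layer. InCount is the size of the
  previous layer; Weights are stored flattened (InCount * UnitCount). }
procedure InitLayer(var L: TLayer; InCount, UnitCount: Integer);
begin
  L.Units := UnitCount;
  GetMem(L.Weights,   InCount * UnitCount * SizeOf(Real));
  GetMem(L.Biases,    UnitCount * SizeOf(Real));
  GetMem(L.Outputs,   UnitCount * SizeOf(Real));
  GetMem(L.Gradients, UnitCount * SizeOf(Real));
end;
```

Because the arrays are addressed through PWeightArray, indexing past the declared bound of TWeightArray requires range checking to be off ({$R-}).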
3. The Math: Linear Algebra
Since you have no libraries, you must write the matrix multiplication loop. This is the "Engine" of your AI.

Key Concept: Dot Product. The output of a neuron is the sum of its inputs multiplied by their respective weights, plus a bias.

Code: Select all

function DotProduct(Inputs, Weights: PWeightArray; Len: Integer): Real;
var
  i: Integer;
  Sum: Real;
begin
  Sum := 0;
  for i := 1 to Len do
    Sum := Sum + Inputs^[i] * Weights^[i];
  DotProduct := Sum;
end;
4. Activation Functions
AI needs non-linearity. The Sigmoid function is the classic choice for Turbo Pascal because it is relatively easy to compute using the built-in Exp function.

Code: Select all

function Sigmoid(X: Real): Real;
begin
  Sigmoid := 1.0 / (1.0 + Exp(-X));
end;

{ Derivative needed for training. Note: X here is the
  Sigmoid output (already activated), not the raw input. }
function SigmoidDerivative(X: Real): Real;
begin
  SigmoidDerivative := X * (1.0 - X);
end;
5. Training: Backpropagation
This is the most complex part. You must:

Forward Pass: Calculate the output based on inputs.
Calculate Error: Subtract the target from the actual output (this sign convention makes step 4 a subtraction).
Backward Pass: Use the chain rule to calculate how much each weight contributed to the error.
Update Weights: Subtract a small fraction of the gradient (Learning Rate) from the weights.
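For the output layer of a single-output network, steps 2-4 above reduce to a few lines. A hedged sketch (TrainOutput, Target, and LearningRate are illustrative names; the error term here is output minus target, so the update subtracts the gradient):

Code: Select all

```pascal
{ Sketch: one gradient step for a single-output layer.
  Inputs are the activations feeding this layer. }
procedure TrainOutput(var L: TLayer; Inputs: PWeightArray;
                      InLen: Integer; Target: Real);
const
  LearningRate = 0.5;
var
  i: Integer;
  ErrorTerm: Real;
begin
  { delta = (output - target) * sigmoid'(output) }
  ErrorTerm := (L.Outputs^[1] - Target) * SigmoidDerivative(L.Outputs^[1]);
  L.Gradients^[1] := ErrorTerm;  { saved for the layer below (chain rule) }
  for i := 1 to InLen do
    L.Weights^[i] := L.Weights^[i] - LearningRate * ErrorTerm * Inputs^[i];
  L.Biases^[1] := L.Biases^[1] - LearningRate * ErrorTerm;
end;
```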

Ozz

Re: Building an AI using Pascal.

Posted: Wed Apr 01, 2026 9:35 pm
by ONiX
Challenges and Best Practices
1. Numeric Precision
Turbo Pascal's Real type (6-byte) is implemented in software and has limited precision. For better results, use the {$N+} compiler directive to enable 8087 coprocessor instructions (with {$E+} for software emulation on machines without an FPU) and use the Double or Extended types.

Code: Select all

{$N+,E+} { 8087 instructions, with software emulation if no FPU is present }
type
  Float = Double; { 8-byte precision }
2. Performance
Nested loops in Pascal are fast, but without a GPU, training will be slow.

Optimization: Avoid calling Exp too often. Use look-up tables (pre-calculated arrays) for activation functions if speed is an issue.
Randomization: Initialize your weights with small random values using Randomize and Random. If every weight starts at zero, the hidden neurons all compute identical outputs and gradients, so the network cannot break symmetry and will not learn.
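The look-up table idea can be sketched as follows. The range [-8, 8] and 1600 steps are arbitrary illustrative choices; outside that range Sigmoid is effectively 0 or 1 anyway.

Code: Select all

```pascal
{ Sketch: pre-computed Sigmoid table to avoid repeated Exp calls. }
const
  TblMin  = -8.0;
  TblMax  = 8.0;
  TblSize = 1600;
var
  SigTable: array[0..TblSize] of Real; { 1601 entries, under 10KB }

procedure BuildSigmoidTable;
var
  i: Integer;
begin
  for i := 0 to TblSize do
    SigTable[i] := 1.0 / (1.0 + Exp(-(TblMin + i * (TblMax - TblMin) / TblSize)));
end;

function FastSigmoid(X: Real): Real;
var
  idx: Integer;
begin
  if X <= TblMin then FastSigmoid := SigTable[0]
  else if X >= TblMax then FastSigmoid := SigTable[TblSize]
  else
  begin
    idx := Trunc((X - TblMin) * TblSize / (TblMax - TblMin));
    FastSigmoid := SigTable[idx];
  end;
end;
```

Call BuildSigmoidTable once at startup, then use FastSigmoid in the inner training loop.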

3. Data I/O
Since you are in DOS, you must handle your own data loading. You will need to write procedures to read binary files of training data (like a simplified MNIST) and procedures to Assign, Rewrite, and BlockWrite your trained weights to disk so you don't lose them when you turn off the computer.
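Saving and restoring a trained weight array with untyped files might look like this sketch (the filename WEIGHTS.BIN is an illustrative choice):

Code: Select all

```pascal
{ Sketch: persist a flattened weight array to disk and back. }
procedure SaveWeights(W: PWeightArray; NumWeights: Integer);
var
  F: file;
begin
  Assign(F, 'WEIGHTS.BIN');
  Rewrite(F, 1);                                { record size = 1 byte }
  BlockWrite(F, W^, NumWeights * SizeOf(Real));
  Close(F);
end;

procedure LoadWeights(W: PWeightArray; NumWeights: Integer);
var
  F: file;
begin
  Assign(F, 'WEIGHTS.BIN');
  Reset(F, 1);
  BlockRead(F, W^, NumWeights * SizeOf(Real));
  Close(F);
end;
```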

Ozz

Re: Building an AI using Pascal.

Posted: Wed Apr 01, 2026 9:37 pm
by ONiX
Suggested Project: The XOR Problem
Don't try to build a Chatbot or Image Generator first. In Turbo Pascal, start with the XOR Problem.

Inputs: (0,0), (0,1), (1,0), (1,1)
Expected Outputs: 0, 1, 1, 0
Structure: 2 Inputs, 2 Hidden Neurons, 1 Output Neuron.
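The whole training set above fits in two typed constants (a sketch; the names are illustrative):

Code: Select all

```pascal
const
  XorIn: array[1..4, 1..2] of Real =
    ((0.0, 0.0), (0.0, 1.0), (1.0, 0.0), (1.0, 1.0));
  XorTarget: array[1..4] of Real = (0.0, 1.0, 1.0, 0.0);
```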

Summary of the "Pure Pascal" Workflow
Initialize: Allocate memory for weights on the Heap.
Randomize: Fill weights with values between -1 and 1.
Loop:
  Feed an input.
  Calculate activations.
  Compare to target.
  Apply Backpropagation.
Verify: Test with inputs the model hasn't seen.
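The workflow above can be skeletonized like this; ForwardPass, Backpropagate, W, NumWeights, and the epoch count are all illustrative placeholders for your own routines.

Code: Select all

```pascal
{ Skeleton of the XOR training loop. }
Randomize;
for i := 1 to NumWeights do
  W^[i] := Random * 2.0 - 1.0;      { random weights in [-1, 1) }

for Epoch := 1 to 10000 do
  for Sample := 1 to 4 do           { the four XOR patterns }
  begin
    ForwardPass(Sample);            { feed input, calculate activations }
    Backpropagate(Sample);          { compare to target, apply updates }
  end;
```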
Building this in Turbo Pascal is an excellent way to understand the "Silicon" level of Artificial Intelligence. You will realize that AI is essentially just a massive, iterated series of additions and multiplications.

Ozz

Re: Building an AI using Pascal.

Posted: Wed Apr 01, 2026 9:43 pm
by ONiX

Code: Select all

O:\PascalScriptEngine\DevPas>DevPas XORProblem.pas
0, 1, 1, 0
Finished.

Re: Building an AI using Pascal.

Posted: Wed Apr 01, 2026 10:18 pm
by ONiX
April Fools' Work:
1. Higher-Level API (Keras-style)
• Introduce layer configuration records (activation, kernel init, bias init, dropout rate, etc.).
• Provide builder functions like `DenseLayer`, `ConvLayer`, `RNNLayer`, returning configured `PLayer` instances.
• Add `TModelConfig`/`TOptimizerConfig` objects for wiring loss functions, learning rates, and gradient updates.
• Create `TTrainer` object to handle `Compile`, `Train`, `Evaluate`, and `Predict` workflows with batching.

2. Model Zoo / Presets
• Define factory functions: `CreateMLP`, `CreateSimpleCNN`, `CreateMiniRNN`, `CreateTinyTransformer`.
• Each preset builds a `TSequential` or `TFunctional` graph of layers, with predefined dimensions.
• Expose presets via a catalog unit (e.g., `KModelZoo`) with descriptive comments.

3. Utilities (Visualization, Logging, Checkpoints, Profiling)
• Tensor visualization: add routines to print tensors as matrices/heatmaps or export to CSV.
• Logging: simple `KLogger` unit that timestamps and writes layer outputs, losses, and metrics to a text file.
• Checkpoints: `KCheckpoint` unit to serialize weights/bias arrays to `.bin` files and restore them.
• Profiling: timed execution around `Forward` calls using `GetTickCount`, storing stats per layer.
• Combine into a `KUtils` unit with helper procedures like `DumpTensor`, `LogMetric`, `SaveState`, `LoadState`, `StartProfile`, `StopProfile`.

Next steps for implementation:
1. Expand `KLayer` to include configuration-driven initializers and optimizer hooks.
2. Create new units (`KTrainer`, `KModelZoo`, `KUtils`) implementing the features above.
3. Build demos/tests for each preset and utility (logging to file, saving/loading checkpoints, visualizing an input tensor).

Let me know which part you’d like to implement first or if you’d like me to start coding one of the modules.