Last-mile performance tuning
Fine-tuning deep neural networks (DNNs) significantly improves performance on domain-specific neural search tasks. However, fine-tuning for neural search is not trivial: it requires combined expertise in machine learning and information retrieval.
Finetuner significantly increases the performance of pretrained models on domain-specific neural search applications.
Simple yet powerful
Interacting with Finetuner is simple, and it lets you choose from a selection of loss functions.
Fine-tune in the cloud
Never again worry about provisioning cloud resources! Finetuner handles all related complexity and infrastructure.
How does it work?
import finetuner
from docarray import DocumentArray

# Login to the Jina ecosystem
finetuner.login()

# Prepare training data
train_data = DocumentArray(...)

# Fine-tune in the cloud
run = finetuner.fit(
    model='resnet50',
    train_data=train_data,
    epochs=5,
    batch_size=128,
)

print(run.name)
print(run.logs())

# When ready
run.save_artifact(directory='experiment')
Load your training data: text, image, video, audio, matrix, tensor, or any Jina-compatible data format.
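Before your data goes into a DocumentArray, it helps to think about how labeled examples are grouped: metric-learning losses draw positive pairs from items that share a label and negatives from items that do not. The sketch below illustrates that grouping with plain Python structures; the dict layout and label key are illustrative, not the Finetuner API.

```python
# Illustrative sketch: organizing labeled training examples before
# converting them into a DocumentArray. Plain dicts are used here for
# clarity; Finetuner itself consumes docarray Documents.
raw_examples = [
    {"content": "red running shoes", "label": "shoes"},
    {"content": "blue canvas sneakers", "label": "shoes"},
    {"content": "leather handbag", "label": "bags"},
]

def group_by_label(examples):
    """Group items by label: same-group items form positive pairs,
    cross-group items form negatives during metric learning."""
    groups = {}
    for ex in examples:
        groups.setdefault(ex["label"], []).append(ex["content"])
    return groups

groups = group_by_label(raw_examples)
print(groups["shoes"])  # both shoe descriptions share one group
```

The same grouping idea applies regardless of modality: text, images, audio, or any other Jina-compatible data.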
Choose your model
Select the backbone model you'd like to fine-tune from our supported DNNs.
Configure the number of epochs, the batch size, and other hyper-parameters.
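One convenient pattern is to keep the hyper-parameters in a single dict and unpack it into the fit call, so experiments differ only in their config. The parameter names below mirror the example above except `learning_rate`, which is an assumed name for illustration.

```python
# Hypothetical hyper-parameter config; values are illustrative defaults,
# not official recommendations.
config = {
    "model": "resnet50",
    "epochs": 5,
    "batch_size": 128,
    "learning_rate": 1e-5,  # assumed parameter name, shown for illustration
}

# The config would then be unpacked into the fit call, e.g.:
# run = finetuner.fit(train_data=train_data, **config)
print(sorted(config))
```

Keeping the config separate makes it easy to sweep one knob at a time while holding the rest fixed.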
Monitor the logs
At any time, retrieve your experiments and runs and monitor their logs.
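Since the run executes in the cloud, a typical workflow polls it periodically and collects logs until it finishes. The sketch below uses a stand-in object: the real Run exposes `name` and `logs()` as in the example above, while the `status()` method and its return values are assumptions for illustration.

```python
import time

class FakeRun:
    """Stand-in for a cloud fine-tuning run, used only to make this
    polling sketch self-contained."""
    def __init__(self):
        self._ticks = 0

    def status(self):
        # Pretend the run finishes after three polls.
        self._ticks += 1
        return "FINISHED" if self._ticks >= 3 else "STARTED"

    def logs(self):
        return f"epoch {self._ticks}/3"

def monitor(run, poll_interval=0.01):
    """Poll a run until it reports FINISHED, collecting its logs."""
    lines = []
    while run.status() != "FINISHED":
        lines.append(run.logs())
        time.sleep(poll_interval)
    lines.append(run.logs())  # capture the final log line
    return lines

history = monitor(FakeRun())
print(history[-1])
```

In practice you would use a longer poll interval (seconds, not milliseconds) against the real run object.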
Finetuned neural network
A tuned DNN that delivers much better search relevance and can be used directly as an Executor in Jina.