I am trying to create an API with Serverless in which I download a pickled machine-learning model into memory. It works locally, but inside the AWS Lambda function it fails, and the logs show:

OpenBLAS WARNING - could not determine the L2 cache size on this system, assuming 256k

This is my code. What can I do? I am lost.

import boto3
import joblib
from io import BytesIO

def predict(event, context):
    body = {"message": "Ok."}

    s3 = boto3.resource('s3')

    with BytesIO() as file:
        s3.Bucket("ml-models").download_fileobj('model_6.pkl', file)
        model = joblib.load(file)

What would be a solution to load the machine learning model that is saved in an S3 bucket?
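One thing worth checking (a guess, not a confirmed diagnosis of the Lambda failure): `download_fileobj` leaves the buffer's cursor at the end of the stream, so a subsequent `joblib.load(file)` reads nothing unless you rewind with `file.seek(0)` first. The following minimal sketch reproduces that behavior locally with the standard-library `pickle` module standing in for `joblib` and an in-memory `BytesIO` standing in for the S3 download; the dictionary contents are made up for illustration.

```python
import pickle
from io import BytesIO

# Stand-in for download_fileobj: write pickled bytes into an in-memory buffer.
buf = BytesIO()
pickle.dump({"coef": [1.0, 2.0]}, buf)

# After writing, the stream position sits at the end of the buffer;
# calling pickle.load(buf) here would raise EOFError. Rewind first:
buf.seek(0)
model = pickle.load(buf)
```

If that is the issue, adding `file.seek(0)` between `download_fileobj` and `joblib.load` in the handler should fix it, in Lambda and locally alike.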