| tags | aliases | date | time | description |
|---|---|---|---|---|
|  |  | 2024-11-10 | 17:00:12 |  |
Can be used as a replacement for scikit-learn
I know, Scikit-Learn isn’t supposed to be a deep learning library, but people use it as if it were. It is incredibly handy for quick prototyping and traditional machine learning models, but when it comes to neural networks, it’s just not in the same league as a library designed with tensors in mind.
Why scikit-learn is Overrated:
No GPU Support: Training deep networks on GPUs can be transformative, but Scikit-Learn simply does not support it.
Not Optimized for Neural Networks: Scikit-learn wasn’t designed for deep learning; using it this way practically guarantees poor results.
What You Should Use Instead: PyTorch
PyTorch is more general and supports GPUs, which makes it a natural fit for deep learning projects. It is also Pythonic: if you are coming from Scikit-Learn, it will feel familiar, but with far more power.
```python
import torch
import torch.nn as nn
import torch.optim as optim

# Define a simple model: 10 input features -> 2 output classes
model = nn.Sequential(
    nn.Linear(10, 5),
    nn.ReLU(),
    nn.Linear(5, 2)
)

# Define optimizer and loss
optimizer = optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()
```
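To show how the pieces above fit together, here is a minimal sketch of a single training step; the random batch (`X`, `y`) is a hypothetical stand-in for real data:

```python
import torch
import torch.nn as nn
import torch.optim as optim

# Same toy architecture as above: 10 input features -> 2 classes
model = nn.Sequential(
    nn.Linear(10, 5),
    nn.ReLU(),
    nn.Linear(5, 2)
)
optimizer = optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# Hypothetical random batch standing in for a real dataset
X = torch.randn(32, 10)          # 32 samples, 10 features
y = torch.randint(0, 2, (32,))   # 32 integer class labels

# One training step: forward pass, loss, backward pass, weight update
optimizer.zero_grad()
logits = model(X)
loss = loss_fn(logits, y)
loss.backward()
optimizer.step()
```

In a real project this step would be wrapped in a loop over epochs and mini-batches from a `DataLoader`.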
If you’re serious about deep learning, you’ll want a library built for the task, which will spare you these limitations and inefficiencies. With PyTorch you can fine-tune models and leverage GPUs to your heart’s content.
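As a sketch of the GPU point above: moving a model and its inputs onto a CUDA device, when one is available, is a small change, something Scikit-Learn has no equivalent for:

```python
import torch
import torch.nn as nn

# Pick the GPU if one is available, otherwise fall back to the CPU
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Move the model's parameters onto the chosen device
model = nn.Sequential(nn.Linear(10, 5), nn.ReLU(), nn.Linear(5, 2)).to(device)

# Inputs must live on the same device as the model
x = torch.randn(4, 10, device=device)
out = model(x)
```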