1. What is KNN?
2. How does KNN work?
3. Distance Metrics
4. Choosing K
5. Classification
6. Regression
7. Optimal K
8. Advantages
9. Disadvantages
10. Lazy Algorithm
11. KNN vs KMeans
12. Normalization
13. High Dimensions
14. Scaling
15. Real-time Usage
16. Weighted KNN
17. Imbalanced Data
18. Time Complexity
19. KNN in Python
20. Applications
| No. | Question | Answer |
|---|---|---|
| 1 | What is K-Nearest Neighbors (KNN)? | KNN is a supervised machine learning algorithm used for classification and regression. |
| 2 | How does KNN work? | KNN works by storing training data and predicting the class of a new point based on its nearest neighbors using distance metrics. |
| 3 | Which distance metrics are used in KNN? | Common metrics: Euclidean, Manhattan, Minkowski (which generalizes both), and cosine distance. |
| 4 | How to choose the value of K? | Use cross-validation and pick K that balances bias vs variance. |
| 5 | How is KNN used for classification? | New data points are assigned the majority class among neighbors. |
| 6 | How is KNN used for regression? | Prediction is based on the average value of neighbors. |
| 7 | How do we find the optimal K value? | Use cross-validation. Small K → overfitting. Large K → underfitting. |
| 8 | What are the advantages of KNN? | Easy to implement, non-parametric, works for both classification & regression. |
| 9 | What are the disadvantages of KNN? | Slow prediction with large datasets, sensitive to irrelevant features. |
| 10 | Why is KNN called a lazy learning algorithm? | Because it stores data and delays learning until prediction. |
| 11 | What is the difference between KNN and KMeans? | KNN: supervised, used for classification and regression. KMeans: unsupervised, used for clustering. |
| 12 | Why is normalization important in KNN? | Because features with larger ranges can dominate distance calculations. |
| 13 | What happens in high dimensions? | Curse of dimensionality: Distance metrics lose effectiveness. |
| 14 | Why is feature scaling needed in KNN? | To ensure all features contribute equally to distance. |
| 15 | Can KNN be used in real-time systems? | Yes, but optimization (KD-trees, Ball-trees) is required. |
| 16 | What is Weighted KNN? | Neighbors are weighted by distance → closer points have higher influence. |
| 17 | How does KNN handle imbalanced data? | Use resampling or weighted voting. |
| 18 | What is the time complexity of KNN? | Training: O(1) (lazy learner). Prediction: O(n × d) per query, where n is the number of training samples and d the number of features. |
| 19 | How is KNN implemented in Python? | Using scikit-learn: KNeighborsClassifier() for classification, KNeighborsRegressor() for regression. |
| 20 | What are the applications of KNN? | Recommendation systems, image recognition, medical diagnosis, anomaly detection. |
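The neighbor-vote procedure described in questions 2 and 5 can be sketched in plain Python. This is a minimal illustration, not a production implementation; the helper name `knn_predict` and the toy data are made up for the example:

```python
from collections import Counter
import math

def knn_predict(train_X, train_y, query, k=3):
    """Classify `query` by majority vote among its k nearest
    training points, using Euclidean distance (illustrative helper)."""
    # Sort all training points by distance to the query.
    dists = sorted(
        (math.dist(x, query), label) for x, label in zip(train_X, train_y)
    )
    # Take the labels of the k closest points and vote.
    k_labels = [label for _, label in dists[:k]]
    return Counter(k_labels).most_common(1)[0][0]

# Toy 2-D data: two well-separated classes.
X = [(1, 1), (1, 2), (2, 1), (8, 8), (8, 9), (9, 8)]
y = ["A", "A", "A", "B", "B", "B"]
print(knn_predict(X, y, (2, 2)))  # (2, 2) lies near cluster A → "A"
```

Note that no model is fit here: all the work happens at prediction time, which is exactly why KNN is called a lazy learner (question 10).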
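Several of the answers above (scaling, weighted voting, choosing K by cross-validation) combine naturally in scikit-learn. The sketch below assumes scikit-learn is installed; the Iris dataset and the candidate K values are illustrative choices, not prescriptions:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_iris(return_X_y=True)

# Scale features first so no single feature dominates the distance
# (questions 12 and 14); weights="distance" gives closer neighbors
# more influence, i.e. weighted KNN (question 16).
pipe = make_pipeline(StandardScaler(), KNeighborsClassifier(weights="distance"))

# Pick K by cross-validation (questions 4 and 7): small K overfits,
# large K underfits.
param_grid = {"kneighborsclassifier__n_neighbors": [1, 3, 5, 7, 9, 11]}
grid = GridSearchCV(pipe, param_grid, cv=5)
grid.fit(X, y)
print(grid.best_params_, round(grid.best_score_, 3))
```

Putting the scaler inside the pipeline matters: it is refit on each cross-validation training fold, so the validation folds never leak into the scaling statistics.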