Step Backward Feature Selection
It is the reverse of Step Forward Feature Selection: as you may have guessed, it starts with the entire set of features and works backward from there, removing features iteratively.

It begins by training a machine learning model on all of the features. It then creates candidate subsets by removing each feature, one at a time, and training and evaluating a new model on each subset. The single feature whose removal leaves the best-performing model is dropped, and the process repeats on the remaining features.
In effect, the features are eliminated one at a time, each round discarding the feature that contributes least to the model's performance, as the sketch below illustrates.
The process stops when removing any further feature would degrade performance beyond a chosen threshold.
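
Below is a minimal sketch of this procedure. The dataset, estimator, cross-validation setup, and threshold value are illustrative assumptions, not part of the original text; the goal is only to show the greedy removal loop and the threshold-based stopping rule described above.

```python
# Step backward feature selection: a minimal, assumption-laden sketch.
# Assumed dataset (breast cancer), estimator (logistic regression), and
# threshold are for illustration only.
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
estimator = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
threshold = 0.002  # maximum acceptable drop in CV accuracy per removal (assumed)

remaining = list(range(X.shape[1]))
# Baseline: model evaluated on the full feature set.
best_score = cross_val_score(estimator, X[:, remaining], y, cv=5).mean()

while len(remaining) > 1:
    # Try removing each remaining feature, one at a time.
    scores = []
    for f in remaining:
        subset = [c for c in remaining if c != f]
        score = cross_val_score(estimator, X[:, subset], y, cv=5).mean()
        scores.append((score, f))

    # Keep the removal that leaves the best-performing model.
    candidate_score, worst_feature = max(scores)

    # Stop if dropping even the least useful feature degrades
    # performance beyond the chosen threshold.
    if best_score - candidate_score > threshold:
        break

    remaining.remove(worst_feature)
    best_score = candidate_score

print("Selected features:", remaining)
print("CV accuracy:", round(best_score, 4))
```

In practice you would rarely hand-roll this loop; libraries such as scikit-learn (SequentialFeatureSelector with direction='backward') and mlxtend (SequentialFeatureSelector with forward=False) provide equivalent implementations.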