This paper explores input randomization in decision tree ensembles for time series classification. We propose an unsupervised discretization method that creates a pool of diverse discretized datasets, and we introduce a novel ensemble method in which each decision tree is trained on one dataset drawn from this pool. Because each discretized dataset has only a small number of bin boundaries, a tree trained on it is forced to split on those boundaries; trees trained on datasets with different discretization boundaries are therefore diverse. The proposed ensembles are simple yet accurate. We study their performance against other popular ensemble techniques: the proposed method matches or outperforms Bagging, and is competitive with AdaBoost.M1 and Random Forests.
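The core idea — one tree per randomly discretized copy of the data, combined by majority vote — can be sketched as follows. This is an illustrative sketch, not the paper's exact algorithm: the class name, the choice of uniformly drawn per-feature cut points, and the bin count are assumptions made here for demonstration.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

class RandomDiscretizationEnsemble:
    """Sketch: each tree trains on a copy of the data discretized with its
    own randomly drawn, unsupervised bin boundaries (hypothetical names)."""

    def __init__(self, n_trees=25, n_bins=8, seed=0):
        self.n_trees, self.n_bins, self.seed = n_trees, n_bins, seed

    def _draw_edges(self, X, rng):
        # draw (n_bins - 1) random cut points per feature within its range;
        # this is one simple unsupervised discretization, assumed here
        return [np.sort(rng.uniform(X[:, j].min(), X[:, j].max(),
                                    self.n_bins - 1))
                for j in range(X.shape[1])]

    @staticmethod
    def _apply_edges(X, edges):
        # replace each feature value by its bin index under the boundaries
        return np.column_stack([np.digitize(X[:, j], e)
                                for j, e in enumerate(edges)])

    def fit(self, X, y):
        rng = np.random.default_rng(self.seed)
        self.members_ = []
        for _ in range(self.n_trees):
            edges = self._draw_edges(X, rng)
            tree = DecisionTreeClassifier(random_state=0)
            tree.fit(self._apply_edges(X, edges), y)
            # keep the edges so test data is discretized identically
            self.members_.append((edges, tree))
        return self

    def predict(self, X):
        votes = np.stack([tree.predict(self._apply_edges(X, edges))
                          for edges, tree in self.members_])
        # plurality vote over the ensemble, per test instance
        return np.apply_along_axis(lambda v: np.bincount(v).argmax(),
                                   0, votes)
```

Because every member sees the data through a different set of boundaries, the trees disagree on borderline instances, which is the source of the ensemble's diversity.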