The development of smart wearable devices has driven rapid progress in activity recognition. However, existing activity recognition methods still struggle to recognize individual arm swings because of coarse-grained sensor data segmentation. Fine-grained, arm-swing-wise data segmentation is vital in specific scenarios such as the rehabilitation of disabled patients. In this paper, we propose a smartwatch-based arm-swing-wise data segmentation approach for human activity recognition, which converts raw sensor signals into square-wave signals to detect the cut-off points of each arm swing. In particular, our method adaptively adjusts the window size and step size of the sliding window, so that changes in swing speed do not need to be handled explicitly. Empirical evaluation on two datasets, a self-collected dataset and a publicly available benchmark dataset, shows that our approach outperforms existing methods under various settings, including different classifiers, features, and wearing positions.
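To make the square-wave idea concrete, the sketch below shows one plausible way such a conversion could work: a wrist-acceleration signal is binarized with a hysteresis threshold, and the edges of the resulting square wave mark candidate cut-off points between consecutive swings, from which window boundaries follow adaptively. The axis choice, threshold value, and function names are illustrative assumptions, not the paper's exact procedure.

```python
# Illustrative sketch only: the hysteresis rule and threshold are assumptions,
# not the exact segmentation procedure proposed in the paper.
import numpy as np

def swing_cutoff_points(signal, hysteresis=0.5):
    """Binarize a 1-D wrist-acceleration signal into a square wave and
    return the indices of its edges as candidate arm-swing cut-off points."""
    centered = signal - np.mean(signal)          # remove gravity/DC offset
    square = np.zeros(len(centered), dtype=int)  # square-wave representation
    state = 0
    for i, x in enumerate(centered):
        if state == 0 and x > hysteresis:        # rising past upper threshold
            state = 1
        elif state == 1 and x < -hysteresis:     # falling past lower threshold
            state = 0
        square[i] = state
    edges = np.flatnonzero(np.diff(square) != 0) + 1
    return square, edges                         # edges delimit single swings
```

Under this sketch, each pair of consecutive edges bounds one arm swing, so the effective window grows or shrinks with the swing itself rather than relying on a fixed window size and step size.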