This is something I take for granted: the DRY principle, Don't Repeat Yourself. It's not that simple, though. It gets tricky when the number of repetitions is small or unpredictable.
One counterargument to the DRY principle is that abstraction libraries are more expensive than embedding the complexity into higher-level code. This argument holds as long as the high-level code doesn't cross a maintainability threshold.
Another counterargument comes from iterative prototyping. Prototypes are usually built top-down, while abstractions are built bottom-up. Building abstractions before everything else forces a reordering of priorities, and the development process starts to look more like the traditional waterfall model, where everything falls into place only at the end of the schedule.
A prototype can be thrown away at any point in time, rendering the abstractions useless. Worse yet, abstraction layers can push other items down the priority list, which can make the prototype look unpromising. Abstractions can actually cause a quick death of the prototype.
The key insight that defuses all these objections to DRY is that libraries need to be developed incrementally, just like applications. Many libraries start as a handful of innocent helper methods. Useful helper methods grow into class libraries used across multiple projects within one company. Internal libraries in areas of general interest are later replaced by public libraries, with their superior documentation, mind share, and quality.
So how does this translate to day-to-day software development? Initially, all complexity should be embedded in the higher layers of code. Once the high-level code gets too verbose or too complicated, a few helper methods will go a long way for a long time; there is no need for a full-blown library. Helper methods can then be expanded where the needs of the higher-level code that uses them justify it. Public libraries are introduced only once internal libraries grow out of control.
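The first step of that progression can be sketched in a short, entirely hypothetical Python example: two pieces of high-level code repeat the same input normalization, and once the repetition becomes noticeable, a single helper method is extracted, with no library machinery involved yet. The names (normalize_key, register_user, register_group) are illustrative assumptions, not taken from any real codebase.

```python
# Hypothetical sketch: several call sites repeated the same
# "strip whitespace, lowercase, reject empty" handling inline.
# A single extracted helper is the whole "abstraction" for now.

def normalize_key(raw: str) -> str:
    """Helper extracted once the repetition became noticeable."""
    key = raw.strip().lower()
    if not key:
        raise ValueError("empty key")
    return key

def register_user(name: str) -> tuple[str, str]:
    # Previously duplicated normalization now lives in one place.
    return ("user", normalize_key(name))

def register_group(name: str) -> tuple[str, str]:
    return ("group", normalize_key(name))

print(register_user("  Alice "))   # → ('user', 'alice')
print(register_group("Admins"))    # → ('group', 'admins')
```

If normalize_key later proves useful across projects, it can graduate into a shared module, and eventually be replaced by a public library covering the same ground.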