This paper introduces PADOC (perturbed-analytic direct transcription for optimal control), a new transcription method for direct trajectory optimization. We build PADOC on a novel segmented decomposition method that provides series solutions for nonlinear problems. To transcribe the infinite-dimensional problem into a finite-dimensional one, PADOC is paired with a new solution approach, which we call average nonlinear programming (aNLP). The aNLP theorem generates a staircase optimal solution (low resolution) and converts it into a distributed optimal solution (high resolution) by exploiting the Hamiltonian of the problem. The analytic architecture equips PADOC with an analytic connection of truncated order between the discrete nodes of the solution. This yields stability, accuracy within the analytic resolution, robustness, and a reduction in the number of decision variables of the resulting nonlinear program (NLP). We also prove that the multipliers associated with the Karush-Kuhn-Tucker optimality conditions of the NLP transcribed by PADOC correspond to a backward analytic solution of the costate equations. Finally, we demonstrate the distinctive features of PADOC through several examples.
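For reference, the Hamiltonian and costate equations invoked above are those of the standard Bolza optimal control problem; a generic statement is sketched below, where the symbols $x$, $u$, $\lambda$, $f$, $L$, and $\Phi$ are conventional placeholders rather than the notation developed later in the paper.
\[
\begin{aligned}
\min_{u(\cdot)} \;\; & J \;=\; \Phi\big(x(t_f)\big) \;+\; \int_{t_0}^{t_f} L\big(x(t),u(t),t\big)\, \mathrm{d}t \\
\text{subject to} \;\; & \dot{x}(t) \;=\; f\big(x(t),u(t),t\big), \qquad x(t_0)=x_0,
\end{aligned}
\]
with Hamiltonian $H(x,u,\lambda,t) = L(x,u,t) + \lambda^{\top} f(x,u,t)$ and costate dynamics $\dot{\lambda} = -\partial H/\partial x$, whose backward solution is the quantity the Karush-Kuhn-Tucker multipliers are shown to recover.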