The Energy Release Rate in the General Delamination Problem of Anisotropic Laminates Under Thermomechanical Loads

Author and Article Information
Wan-Lee Yin

School of Civil and Environmental Engineering, Georgia Institute of Technology, Atlanta, GA 30332-0355

J. Appl. Mech. 65(1), 85-92 (Mar 01, 1998) (8 pages); doi:10.1115/1.2789051. History: Received January 06, 1997; Revised June 30, 1997; Online October 25, 2007


A simple formulation, expressed directly in terms of the local variables of laminated plate theory, is used to derive the general expression of the energy release rate at a boundary point of an arbitrarily shaped delamination in a multilayered anisotropic laminate under combined mechanical and thermal loads. The intact and debonded sublaminates are modeled using the first-order shear deformation theory. If the thermoelastic constitutive equations of the sublaminates are linear and uncoupled, then the expression of the energy release rate may be reduced to a simple form depending only on the sublaminate stiffness coefficients and the local values of the midplane strains and curvatures. The expression does not explicitly involve the temperature load, and is also independent of the strain and curvature parameters tangential to the delamination front. The corresponding expression for delamination in classical laminated plates is also given. The results are applied to the problem of a laminated strip with a fully developed edge delamination under axial extension, bending, and twisting.
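The reduced expression described above depends only on sublaminate stiffness coefficients and the local midplane strains and curvatures. As an illustration of those ingredients (not the paper's actual energy release rate formula), the following Python sketch assembles the classical-lamination-theory A, B, D stiffness matrices of a laminate from hypothetical ply properties and evaluates the strain-energy density per unit area, W = (1/2)(εᵀAε + 2εᵀBκ + κᵀDκ); all material values and the layup are illustrative assumptions.

```python
import numpy as np

def ply_Q(E, nu):
    # Reduced plane-stress stiffness of an isotropic ply (illustrative material).
    Q11 = E / (1.0 - nu**2)
    return np.array([[Q11,      nu * Q11, 0.0],
                     [nu * Q11, Q11,      0.0],
                     [0.0,      0.0,      E / (2.0 * (1.0 + nu))]])

def abd_matrices(Qs, z):
    # Qs: per-ply 3x3 stiffness matrices; z: through-thickness interface
    # coordinates z[0]..z[n], measured from the laminate midplane.
    A = sum(Q * (z[k + 1] - z[k]) for k, Q in enumerate(Qs))
    B = sum(Q * (z[k + 1]**2 - z[k]**2) / 2.0 for k, Q in enumerate(Qs))
    D = sum(Q * (z[k + 1]**3 - z[k]**3) / 3.0 for k, Q in enumerate(Qs))
    return A, B, D

def strain_energy_density(A, B, D, eps, kap):
    # Strain energy per unit area in terms of midplane strains eps and
    # curvatures kap: W = 1/2 (eps^T A eps + 2 eps^T B kap + kap^T D kap).
    return 0.5 * (eps @ A @ eps + 2.0 * eps @ B @ kap + kap @ D @ kap)
```

In the paper's setting, quantities of this kind are evaluated for each sublaminate at the delamination front using the strain and curvature components normal to the front; the sketch only shows how the stiffness coefficients and local deformation measures enter such an energy density.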

Copyright © 1998 by The American Society of Mechanical Engineers