Writing Secure Java Code:
A Taxonomy of Heuristics and an Evaluation of Static Analysis Tools
(May 2008)
The taxonomy below is briefly described in a paper I wrote for the NIST SAMATE Static Analysis Workshop (2008). Definitions for the design principles included in the taxonomy appear below; for complete discussions of them, please read my thesis.

I've also written a few custom FindBugs detectors that enforce some of these heuristics.

Key:
[bookmark icon] Design Principle
[wizard icon] Design Heuristic
[button_ok icon] Coding Heuristic
Correct Modules
The principle of correct modules states that designs with modules that meet their specifications and are built following safe programming practices are better.

Understandability
The principle of understandability states that designs with modules that are easier to understand are better.

Reduced Complexity
The principle of reduced complexity [8] states that designs with modules that are less complex are better.

Economy of Mechanism
The principle of economy of mechanism [1] states that designs with modules that are simple and small are better.

Minimized Security Elements
The principle of minimized security elements [8] states that designs that have a minimal number of security-critical modules are better.

Isolated Security Elements
The principle of isolated security elements states that designs that isolate modules that provide security-critical functionality are better.

Small Modules
The principle of small modules [10] states that designs with small modules are better.

Information Hiding
The principle of information hiding [10, 12] states that designs with modules that shield the details of their internal structure and processing from other modules are better.
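As a minimal Java sketch of this idea (the Wallet class and its fields are my own illustration, not part of the taxonomy), the internal representation is private and reachable only through a narrow interface, so other modules can neither depend on it nor corrupt it:

```java
// Illustrative only: the internal balance representation (cents as a long)
// is hidden behind a small public interface.
class Wallet {
    private long cents; // internal detail, invisible to other modules

    void deposit(long amountCents) {
        if (amountCents < 0) throw new IllegalArgumentException("negative deposit");
        cents += amountCents;
    }

    long balanceInCents() {
        return cents; // read-only view; no setter for direct state access
    }
}
```

Because the representation is hidden, it could later change (say, to BigDecimal) without touching any caller.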

Low Coupling
The principle of low coupling [10] states that designs that minimize the degree of connection between pairs of modules are better.

High Cohesion
The principle of high cohesion [10] states that designs that maximize the degree to which a module’s parts are related to one another are better.

Continuous Protection of Information
The principle of continuous protection of information [8] states that designs that perform operations that continuously protect sensitive information in every system state are better.

Secure Defaults
The principle of secure defaults [8] states that designs with modules that have secure initial configurations are better.
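A rough Java sketch of secure defaults (the ConnectionConfig class and its settings are hypothetical): the no-argument state is the most restrictive one, so a caller must deliberately weaken the configuration rather than remember to harden it:

```java
// Hypothetical configuration object whose initial state is the secure one.
class ConnectionConfig {
    private boolean encryptionEnabled = true;  // secure out of the box
    private boolean anonymousAccess = false;   // must be explicitly enabled

    boolean isEncryptionEnabled() { return encryptionEnabled; }
    boolean allowsAnonymousAccess() { return anonymousAccess; }

    // Weakening the configuration requires an explicit, auditable call.
    void disableEncryption() { encryptionEnabled = false; }
}
```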

Secure Initialization
The principle of secure initialization states that designs with modules that perform initialization functionality in a secure manner are better.
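One way to sketch this in Java (the SessionToken class is my own example): all validation runs before any state is assigned, and fields are final, so no partially initialized or insecure instance can ever be observed:

```java
// Illustrative sketch: fail before initializing, and make state immutable.
class SessionToken {
    private final String owner;

    SessionToken(String owner) {
        if (owner == null || owner.isEmpty()) {
            throw new IllegalArgumentException("owner required"); // reject before init
        }
        this.owner = owner;
    }

    String owner() { return owner; }
}
```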

Strong Protection Mechanisms
The principle of strong protection mechanisms states that designs with modules that provide protection in the most secure manner possible are better.

Defense in Depth
The principle of defense in depth [3] states that designs that establish protective barriers across multiple dimensions of a module are better.

Fail-safe Defaults
The principle of fail-safe defaults [1] states that designs with modules that deny access to objects unless entities have been granted explicit access permissions are better [4].
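A small default-deny sketch in Java (the class and method names are illustrative): access is granted only to entities that were explicitly added to the permitted set, and every unknown entity falls through to "deny":

```java
import java.util.HashSet;
import java.util.Set;

// Hypothetical access-control list that denies by default.
class DefaultDenyAcl {
    private final Set<String> permitted = new HashSet<>();

    void grant(String entity) { permitted.add(entity); }

    boolean isAllowed(String entity) {
        return permitted.contains(entity); // absence of a grant means "no"
    }
}
```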

Self Analysis
The principle of self analysis [8] states that designs with modules that assess their own internal state are better.

Least Privilege
The principle of least privilege [1] states that designs with modules that do not have access to unneeded resources are better [10].
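In Java 2 security terms, one common coding consequence of this principle is keeping privileged sections as small as possible. A sketch using the standard AccessController API (the surrounding class is my own example): only the single property read runs inside the privileged block, not the surrounding logic:

```java
import java.security.AccessController;
import java.security.PrivilegedAction;

// Illustrative: confine elevated privileges to the smallest possible scope.
class VersionReader {
    static String readVersion() {
        return AccessController.doPrivileged(new PrivilegedAction<String>() {
            public String run() {
                // Only this one read executes with the caller's privileges.
                return System.getProperty("java.version");
            }
        });
    }
}
```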

Complete Mediation
The principle of complete mediation [1] states that designs with modules that check every access to sensitive objects for authorization are better.
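A Java sketch of complete mediation (the MediatedStore class is hypothetical): every read goes through the authorization check, with no "checked once, trusted forever" cache, so revoking a subject takes effect on the very next access:

```java
import java.util.HashSet;
import java.util.Set;

// Illustrative: the check runs on EVERY access to the sensitive object.
class MediatedStore {
    private final Set<String> authorized = new HashSet<>();
    private final String secret = "s3cret";

    void authorize(String subject) { authorized.add(subject); }
    void revoke(String subject) { authorized.remove(subject); }

    String read(String subject) {
        if (!authorized.contains(subject)) { // mediation on every call
            throw new SecurityException("access denied: " + subject);
        }
        return secret;
    }
}
```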

Separation of Privilege
The principle of separation of privilege [1] states that designs with modules that require two or more conditions to be met before an action is permitted are better.
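A two-person-rule sketch in Java (the DualApproval class is my own illustration): the sensitive action is authorized only when two independent conditions hold, here approvals from two distinct officers:

```java
// Illustrative: authorization requires two distinct approvers.
class DualApproval {
    private String firstApprover;
    private String secondApprover;

    void approve(String officer) {
        if (firstApprover == null) firstApprover = officer;
        else if (!officer.equals(firstApprover)) secondApprover = officer;
        // A repeated approval by the same officer is ignored.
    }

    boolean isAuthorized() {
        return firstApprover != null && secondApprover != null; // both conditions met
    }
}
```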

Trust Boundaries
The principle of trust boundaries [5] states that designs that clearly establish domains of trust between interacting modules that exchange or manipulate data are better.

Reluctance to Trust
The principle of reluctance to trust [9, 50] states that designs with modules that treat all interactions with external entities as potentially malicious are better.
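In Java code, this reluctance often shows up as validating caller-supplied data and storing defensive copies, a classic idiom discussed in Bloch's Effective Java [11]. A sketch (the Period class is illustrative): the constructor validates its argument and copies it, so a hostile caller cannot mutate the internal Date after the check:

```java
import java.util.Date;

// Illustrative: treat the caller's mutable argument as untrusted.
class Period {
    private final Date start;

    Period(Date start) {
        if (start == null) throw new IllegalArgumentException("start is null");
        this.start = new Date(start.getTime()); // defensive copy of untrusted input
    }

    Date start() {
        return new Date(start.getTime()); // never expose the internal copy
    }
}
```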

Least Common Mechanism
The principle of least common mechanism [1] states that designs with modules that minimize the number of shared access paths to information are better.

Controlled Sharing
The principle of controlled sharing states that designs whose modules share information in a controlled manner are better.

Secure Transfer
The principle of secure transfer states that designs with modules that protect information that is transferred outside complete control of a software system are better.

Secure Failure
The principle of secure failure [8] states that designs with modules that do not jeopardize security when failures occur are better.
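A fail-closed sketch in Java (the classes here are my own illustration): if the authorization subsystem throws, the guard answers "deny" rather than "allow", so failure of the mechanism cannot grant access:

```java
// Illustrative: a failing security mechanism must deny, not allow.
class FailClosedGuard {
    interface Authorizer { boolean isAllowed(String user) throws Exception; }

    private final Authorizer authorizer;

    FailClosedGuard(Authorizer authorizer) { this.authorizer = authorizer; }

    boolean check(String user) {
        try {
            return authorizer.isAllowed(user);
        } catch (Exception e) {
            // Fail closed: an internal error must not become an access grant.
            return false;
        }
    }
}
```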

Secure Disposal
The principle of secure disposal [45] states that designs with modules that dispose of sensitive information when it is no longer needed are better.
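In Java this is one reason to hold secrets in a char[] rather than an immutable String: the array can be zeroed as soon as the secret is no longer needed, shrinking the window in which a heap dump could reveal it. A sketch (the Credential class is hypothetical):

```java
import java.util.Arrays;

// Illustrative: wipe the secret as soon as it has served its purpose.
class Credential {
    static boolean checkAndDispose(char[] password, char[] expected) {
        try {
            return Arrays.equals(password, expected);
        } finally {
            Arrays.fill(password, '\0'); // secure disposal of the caller's copy
        }
    }
}
```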

Secure Shutdown
The principle of secure shutdown [45] states that designs with modules that do not jeopardize security when performing shutdown or termination functionality are better.

Accountability
The principle of accountability [8] states that designs that record security-relevant sequences of actions and trace them to the entity (e.g., module or user) that caused the actions to occur are better.
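A minimal audit-trail sketch using the standard java.util.logging API (the logger name and message format are my own choices): every security-relevant action is recorded together with the entity that caused it, so actions can later be traced back to a principal:

```java
import java.util.logging.Level;
import java.util.logging.Logger;

// Illustrative: record who did what, in a form that can be traced later.
class AuditLog {
    private static final Logger AUDIT = Logger.getLogger("audit");

    static String record(String entity, String action) {
        String entry = entity + " performed " + action;
        AUDIT.log(Level.INFO, entry); // timestamped, attributable record
        return entry;
    }
}
```

A production design would typically use a dedicated logging framework such as log4j [55] with a tamper-resistant appender.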

Bibliography

[1] J. H. Saltzer and M. D. Schroeder, “The Protection of Information in Computer Systems,” in Proceedings of the IEEE, vol. 63, no. 9, 1975, pp. 1278-1308. Available: http://web.mit.edu/Saltzer/www/publications/protection.

[2] Sun Microsystems, Inc., "Secure Coding Guidelines for the Java Programming Language, version 2.0," Sun Microsystems, Inc. [Online]. Available: http://java.sun.com/security/seccodeguide.html. [Accessed: Aug. 30, 2007].

[3] S. Redwine, Jr., Ed., Software Assurance: A Guide to the Common Body of Knowledge to Produce, Acquire, and Sustain Secure Software, Workforce Education and Training Working Group, U.S. Department of Homeland Security, Draft Version 1.1, September 2006.

[4] M. Bishop, Introduction to Computer Security. Boston, MA: Addison-Wesley, 2005.

[5] M. Howard and D. LeBlanc, Writing Secure Code, 2nd ed. Redmond, Washington: Microsoft Press, 2003.

[6] G. McGraw, "Software Security," IEEE Security and Privacy, vol. 2, no. 2, pp. 80-83, March/April 2004.

[7] S. Redwine, Jr. and N. Davis, Eds., Processes to Produce Secure Software: Towards more Secure Software, National Cyber Security Summit, Software Process Subgroup of the Task Force on Security across the Software Development Lifecycle, vol. 1, March 2004.

[8] T. V. Benzel, C. E. Irvine, T. E. Levin, G. Bhaskara, and P. C. Nguyen, “Design Principles for Security,” Naval Postgraduate School: Monterey, California, Tech. Rep. NPS-CS-05-010, September 2005. Available: http://handle.dtic.mil/100.2/ADA437854. [Accessed: Sept. 5, 2007].

[9] J. Viega and G. McGraw, Building Secure Software: How to Avoid Security Problems the Right Way. Indianapolis, IN: Addison-Wesley, 2002.

[10] C. Fox, Introduction to Software Engineering Design: Processes, Principles, and Patterns with UML2. Boston, MA: Addison-Wesley, 2006.

[11] J. Bloch, Effective Java Programming Language Guide. Prentice Hall, 2001.

[12] D. L. Parnas, “On the Criteria To Be Used in Decomposing Systems into Modules,” in Communications of the ACM, vol. 15, no. 12, 1972, pp. 1053-1058.

[13] M. Graff and K. van Wyk, Secure Coding: Principles and Practices. Sebastopol, CA: O'Reilly, 2003.

[14] Sun Microsystems, Inc., “White Paper: The Java Language Environment,” Sun Microsystems, Inc., 1997. [Online]. Available: http://java.sun.com/docs/white/langenv/Security.doc5.html. [Accessed: October 1, 2006].

[15] G. McGraw and E. Felten, Java Security: Hostile Applets, Holes, and Antidotes. Canada: Wiley Computer Publishing, 1997.

[16] Sun Microsystems, Inc. “News and Updates: Chronology of security-related bugs and issues,” Sun Microsystems, Inc., 2002. [Online]. Available: http://java.sun.com/sfaq/chronology.html. [Accessed: October 4, 2006].

[17] L. Gong, G. Ellison, and M. Dageforde, Inside Java 2 Platform Security: Architecture, API Design, and Implementation, 2nd ed. Boston, MA: Addison-Wesley, 2003.

[18] J. Gosling, B. Joy, G. Steele, and G. Bracha, “The Java Language Specification,” 3rd ed. Sun Microsystems, Inc. [Online]. Available: http://java.sun.com/docs/books/jls. [Accessed: December 11, 2007].

[19] Sun Microsystems, Inc. “Security Managers and the Java SE SDK,” Rev. 1.7, Sun Microsystems, Inc.. [Online]. Available: http://java.sun.com/javase/6/docs/technotes/guides/security/smPortGuide.html. [Accessed: March 13, 2008].

[20] L. Gong, “Java 2 Platform Security Architecture,” Ver. 1.2, Sun Microsystems, Inc. 1997-2002. [Online]. Available: http://java.sun.com/javase/6/docs/technotes/guides/security/spec/security-spec.doc.html. [Accessed: March 13, 2008].

[21] Sun Microsystems, Inc. “Default Policy Implementation and Policy File Syntax,” Rev. 1.6, Sun Microsystems, Inc. [Online]. Available: http://java.sun.com/javase/6/docs/technotes/guides/security/PolicyFiles.html. [Accessed: March 11, 2008].

[22] D. Dean, E. Felten, and D. Wallach, “Java Security: Web Browsers and Beyond,” in Proceedings of the 1996 IEEE Symposium on Security and Privacy (SP '96), 1996.

[23] G. McGraw and E. Felten, Securing Java: Getting Down to Business with Mobile Code. John Wiley & Sons, Inc., 1998. [Online]. Available: http://www.securingjava.com. [Accessed: March 13, 2008].

[24] R. Alexander, J. Bieman, and J. Viega, “Coping with Java Programming Stress,” IEEE Computer, vol. 33, no. 4, pp. 30-38, April 2000.

[25] G. McGraw, Software Security: Building Security In. Boston, MA; Addison-Wesley, 2006.

[26] The MITRE Corporation, “Common Weaknesses Enumeration: A Community-Developed Dictionary of Software Weakness Types,” Draft 7, The MITRE Corporation, 2007. [Online]. Available: http://cwe.mitre.org. [Accessed: October 28, 2007].

[27] Sun Microsystems, Inc. “API for Privileged Blocks,” Rev. 1.6, Sun Microsystems, Inc. [Online]. Available: http://java.sun.com/javase/6/docs/technotes/guides/security/doprivileged.html. [Accessed: October 5, 2007].

[28] Fortify Software Inc., “Fortify Taxonomy: Software Security Errors,” Fortify Software Inc., 2006. [Online]. Available: http://www.fortifysoftware.com/vulncat. [Accessed October 4, 2006].

[29] M. Howard, D. LeBlanc, and J. Viega, 19 Deadly Sins of Software Security: Programming Flaws and How to Fix Them. Emeryville, CA: McGraw-Hill/Osborne, 2005.

[30] R. P. Abbott, J. S. Chin, J. E. Donnelley, W. L. Konigsford, S. Tokubo, and D. A. Webb, “Security Analysis and Enhancements of Computer Operating Systems,” Institute for Computer Sciences and Technology, National Bureau of Standards, Tech. Rep. NBSIR 76-1041, April 1976. Available: http://cwe.mitre.org/about/sources.html.

[31] R. Bisbey II and D. Hollingsworth, “Protection Analysis: Final Report,” CA: University of Southern California Information Sciences Institute, Tech. Rep. ISI/RR-78-13, 1978.

[32] C. Landwehr, A. Bull, J. McDermott, and W. Choi, “A Taxonomy of Computer Program Security Flaws, with Examples,” in ACM Computing Surveys (CSUR), vol. 26, no. 3, September 1994, pp. 211-254.

[33] M. Bishop and D. Bailey, “A Critical Analysis of Vulnerability Taxonomies,” University of California at Davis, Tech. Rep. CSE-96-11, September 1996.

[34] S. Weber, P. Karger, and A. Paradkar, “A Software Flaw Taxonomy: Aiming Tools At Security,” in ACM SIGSOFT Software Engineering Notes, vol. 30, no. 4, July 2005.

[35] Open Web Application Security Project, “The free and open application security community,” [Online]. Available: http://www.owasp.org. [Accessed: March 1, 2008].

[36] CERT, “CERT Statistics,” Software Engineering Institute: Carnegie Mellon. [Online]. Available: http://www.cert.org/stats. [Accessed: October 3, 2007].

[37] CERT, “Secure Coding,” Software Engineering Institute: Carnegie Mellon. [Online]. Available: http://www.cert.org/secure-coding. [Accessed: December 12, 2006].

[38] ISO/IEC JTC 1/SC 22/OWG: Vulnerabilities, “Guidance for Avoiding Vulnerabilities through Language Selection and Use,” [Online]. Available: http://www.aitcnet.org/isai. [Accessed: October 24, 2007].

[39] K. Arnold, J. Gosling, D. Holmes, The Java Programming Language, 4th ed., Upper Saddle River, NJ: Addison-Wesley, 2005. [Online]. Available: http://proquest.safaribooksonline.com/0321349806. [Accessed: September 25, 2007].

[40] D. Piliptchouk, “Java vs. .NET Security – Part 3,” O’Reilly ONJava.com, [Online]. Available: http://www.onjava.com/pub/a/onjava/2004/01/28/javavsdotnet.html?page=2. [Accessed: October 15, 2007].

[41] R. Seacord, “Secure Coding Standards,” in Proceedings of the Static Analysis Summit, NIST Special Publication 500-262, July 2006. Available: http://samate.nist.gov/docs/NIST_Special_Publication_500-262.pdf.

[42] S. McConnell, Code Complete, 2nd ed. Redmond, Washington: Microsoft Press, 2004.

[43] B. Chess and J. West, Secure Programming with Static Analysis. Boston, MA: Addison-Wesley, 2007.

[44] K. Goertzel Ed., T. Winograd, H. McKinley, P. Holley, Security in the Software Lifecycle: Making Software Development Processes – and Software Produced by Them – More Secure, US Department of Homeland Security, Draft Version 1.2, August 2006.

[45] S. Redwine Jr., “Towards an Organization for Software System Security Principles and Guidelines,” Institute for Infrastructure and Information Assurance, James Madison University: Harrisonburg, VA, Tech. Rep. 08-01, Version 1.0, February 2008.

[46] K. Tsipenyuk, B. Chess, G. McGraw, “Seven Pernicious Kingdoms: A Taxonomy of Software Security Errors,” in NIST Workshop on Software Security Assurance Tools, Techniques, and Metrics, Long Beach, CA, November 2005.

[47] A. Carzaniga, G. Picco, G. Vigna, “Is Code Still Moving Around? Looking Back at a Decade of Code Mobility,” in Companion to the proceedings of the 29th International Conference on Software Engineering, 2007, pp. 9-20.

[48] CLASP, “Comprehensive Lightweight Application Security Process,” Secure Software, Inc., Version 2.0, 2006. [Online]. Available: http://searchsoftwarequality.techtarget.com/searchAppSecurity/downloads/clasp_v20.pdf. [Accessed December 5, 2007].

[49] A. J. Riel, Object-Oriented Design Heuristics. Addison-Wesley, 1996.

[50] S. Barnum and M. Gegick, “Reluctance to Trust,” Build Security In: Setting a Higher Standard for Software Assurance, Cigital Inc., 2005. [Online]. Available: https://buildsecurityin.us-cert.gov/daisy/bsi/articles/knowledge/principles/355.html. [Accessed: January 17, 2008].

[51] C. Lai, “Java Insecurity: Accounting for Subtleties That Can Compromise Code,” IEEE Software, vol. 25, no. 1, pp. 13-19, January/February 2008.

[52] S. Liang, The Java Native Interface: Programmer’s Guide and Specification, Palo Alto, CA: Sun Microsystems Inc., 2002. [Online]. Available: http://java.sun.com/docs/books/jni/html/titlepage.html.

[53] S. Barnum and M. Gegick, “Design Principles,” Build Security In: Setting a Higher Standard for Software Assurance, Cigital Inc., 2005. [Online]. Available: https://buildsecurityin.us-cert.gov/daisy/bsi/articles/knowledge/principles/358.html?branch=1&language=1. [Accessed: January 17, 2008].

[54] CERT, “Top 10 Secure Coding Practices,” Software Engineering Institute: Carnegie Mellon. [Online]. Available: https://www.securecoding.cert.org/confluence/display/seccode/Top+10+Secure+Coding+Practices. [Accessed: February 16, 2008].

[55] Apache Software Foundation, “Logging Services: log4j,” Apache Software Foundation, [Online]. Available: http://logging.apache.org/log4j/1.2/index.html. [Accessed: February 18, 2008].

[56] Sun Microsystems, Inc. “Java SE 6,” Sun Microsystems, Inc. [Online]. Available: http://java.sun.com/javase/6.

[57] Sun Microsystems, Inc. “Java EE at a Glance,” Sun Microsystems, Inc. [Online]. Available: http://java.sun.com/javaee.

[58] M. Howard and S. Lipner, The Security Development Lifecycle: SDL: A Process for Developing Demonstrably More Secure Software, Microsoft Press, June 2006. [Online]. Available: http://proquest.safaribooksonline.com/0735622140. [Accessed: January 28, 2008].

[59] B. Chess and G. McGraw, “Static Analysis for Security,” IEEE Security and Privacy, vol. 2, no. 6, pp. 76-79, November/December 2004.

[60] O. Burn, “Checkstyle 4.4,” [Online]. Available: http://checkstyle.sourceforge.net. [Accessed: March 1, 2008].

[61] S. Gutz and O. Marquez, “TPTP Static Analysis Tutorial Part 1: A Consistent Analysis Interface,” [Online]. Available: http://www.eclipse.org/tptp/home/documents/process/development/static_analysis/TPTP_static_analysis_tutorial_part1.html. [Accessed: February 10, 2008].

[62] D. Hovemeyer and W. Pugh, “Finding Bugs is Easy,” in SIGPLAN Notices, vol. 39, no. 12, December 2004, pp. 92-106.

[63] Fortify Software Inc., “Fortify Source Code Analysis (SCA),” Fortify Software Inc. [Online]. Available: http://www.fortify.com/products/sca.

[64] jutils.com, “Lint4j,” [Online]. Available: http://www.jutils.com.

[65] InfoEther, “PMD,” [Online]. Available: http://pmd.sourceforge.net.

[66] QJ-Pro, “Code Analyzer for Java,” [Online]. Available: http://qjpro.sourceforge.net.

[67] R. Martin, S. Barnum, S. Christey, “Being Explicit about Security Weaknesses,” presented at Black Hat DC 2007, 2007. Available: http://cwe.mitre.org/about/documents.html.

[68] The SANS Institute, “SANS Software Security Institute,” The SANS Institute. [Online]. Available: http://www.sans-ssi.org. [Accessed: March 7, 2008].

[69] National Institute of Standards and Technology, “SAMATE: Software Assurance Metrics and Tool Evaluation,” National Institute of Standards and Technology. [Online]. Available: http://samate.nist.gov. [Accessed: February 5, 2008].

[70] Jlint, “About Jlint,” [Online]. Available: http://jlint.sourceforge.net.

[71] N. Rutar, C. Almazan, and J. Foster, “A Comparison of Bug Finding Tools for Java,” in Proceedings of the 15th IEEE International Symposium on Software Reliability Engineering, France, November 2004.

[72] S. Wagner, J. Jurjens, C. Koller, P. Trischberger, “Comparing Bug Finding Tools with Reviews and Tests,” in Proceedings of the 17th International Conference on Testing of Communication Systems, 2005, pp. 40-55.

[73] S. Wagner, F. Deissenboeck, J. Wimmer, M. Aichner, M. Schwab, “An Evaluation of Two Bug Pattern Tools for Java,” to appear in Proceedings of the 1st IEEE International Conference on Software Testing, Verification and Validation, 2008.

[74] N. Ayewah, W. Pugh, J. Morgenthaler, J. Penix, Y. Zhou, “Evaluating Static Analysis Defect Warnings On Production Software,” in Proceedings of the 7th ACM SIGPLAN-SIGSOFT Workshop on Program Analysis for Software Tools and Engineering, 2007, pp. 1-8.

[75] T. Aslam, “A Taxonomy of Security Faults in the Unix Operating System,” M.S. Thesis, Purdue University, 1995.



Custom FindBugs Detectors

If you're familiar with FindBugs (an open-source project), I've written a few simple custom detectors that enforce some of these heuristics, namely:


This page was partially created using Freemind.

Copyright © 2008 Michael S. Ware