add missing ceedling files

This commit is contained in:
hathach 2022-12-08 09:54:15 +07:00
parent 91d5fa5639
commit 4b50ca2a61
23 changed files with 1909 additions and 0 deletions

View File

@ -0,0 +1,76 @@
ceedling-bullseye
=================
# Plugin Overview
Plugin for integrating Bullseye code coverage tool into Ceedling projects.
This plugin requires a working license to the Bullseye code coverage tools. The tools
must be on your path, or their location should be added to the environment in the
`project.yml` file.
## Configuration
The bullseye plugin supports configuration options via your `project.yml` provided
by Ceedling. The following is a typical configuration example:
``` YAML
:bullseye:
:auto_license: TRUE
:plugins:
:bullseye_lib_path: []
:paths:
:bullseye_toolchain_include: []
:tools:
:bullseye_instrumentation:
:executable: covc
:arguments:
- '--file $': ENVIRONMENT_COVFILE
- -q
- ${1}
:bullseye_compiler:
:executable: gcc
:arguments:
- -g
- -I"$": COLLECTION_PATHS_TEST_SUPPORT_SOURCE_INCLUDE_VENDOR
- -I"$": COLLECTION_PATHS_BULLSEYE_TOOLCHAIN_INCLUDE
- -D$: COLLECTION_DEFINES_TEST_AND_VENDOR
- -DBULLSEYE_COMPILER
- -c "${1}"
- -o "${2}"
:bullseye_linker:
:executable: gcc
:arguments:
- ${1}
- -o ${2}
- -L$: PLUGINS_BULLSEYE_LIB_PATH
- -lcov
:bullseye_fixture:
:executable: ${1}
:bullseye_report_covsrc:
:executable: covsrc
:arguments:
- '--file $': ENVIRONMENT_COVFILE
- -q
- -w140
:bullseye_report_covfn:
:executable: covfn
:stderr_redirect: :auto
:arguments:
- '--file $': ENVIRONMENT_COVFILE
- --width 120
- --no-source
- '"${1}"'
:bullseye_browser:
:executable: CoverageBrowser
:background_exec: :auto
:optional: TRUE
:arguments:
- '"$"': ENVIRONMENT_COVFILE
```
## Example Usage
```sh
ceedling bullseye:all utils:bullseye
```

View File

@ -0,0 +1,20 @@
ceedling-colour-report
======================
## Overview
The colour_report plugin replaces the normal Ceedling "pretty" output with
a colorized variant, making the results easier to read from
a standard command line. This is very useful on developer machines, but
can occasionally cause problems with parsing on CI servers.
## Setup
Enable the plugin in your project.yml by adding `colour_report`
to the list of enabled plugins.
``` YAML
:plugins:
:enabled:
- colour_report
```

View File

@ -0,0 +1,29 @@
compile_commands_json
=====================
## Overview
Syntax highlighting and code completion are hard. Historically, each editor or IDE has implemented its own and then competed to offer the best experience for developers. Often developers would stick with an IDE that felt cumbersome and slow just because it had the best syntax highlighting on the market. If doing it for one language is hard (and it is), imagine doing it for dozens of them. Imagine a full-stack developer who has to work with CSS, HTML, JavaScript, and some Ruby: they need excellent support in all of those languages, which makes things even harder.
In June of 2016, Microsoft, Red Hat, and Codenvy got together to create a standard called the Language Server Protocol (LSP). The idea was simple: by standardising on one protocol, all the IDEs and editors out there would only have to support LSP, rather than a custom plugin for each language. In turn, the backend code that actually does the highlighting can be written once and used by any IDE that supports LSP. Many editors already support it, such as Sublime Text, vim, and emacs. This means that if you're using a crufty old IDE, or worse, a shiny new editor without code completion, then this could be just the upgrade you're looking for!
For C and C++ projects, many people use the `clangd` backend. To do things like "go to definition", `clangd` needs to know how to build the project so that it can figure out all the pieces of the puzzle. There are manual tools such as `bear` which can be run with `gcc` or `clang` to extract this information, but they have a big limitation: if run with `ceedling release`, you won't get any auto completion for Unity, and you'll also get error messages reported by your IDE because of what it perceives as missing headers. If you do the same with `ceedling test`, you now get Unity but might miss things that are only seen in the release build.
This plugin resolves that issue. Because it is run by Ceedling, it has access to all the build information it needs to create the perfect `compile_commands.json`. Once enabled, this plugin will generate that file and place it in `./build/artifacts/compile_commands.json`. `clangd` will search your project for this file, but it is easier to symlink it into the root directory (for example, `ln -s ./build/artifacts/compile_commands.json`).
For more information on LSP and to find out if your editor supports it, check out https://langserver.org/
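To make this concrete, here is a small sketch (in Ruby, the plugin's own language) of what one compilation-database entry looks like. The `directory`/`command`/`file` fields come from the Clang JSON Compilation Database format; the project root and compile command shown are purely hypothetical.

```ruby
require 'json'

# One entry in a Clang JSON compilation database. The paths and the
# compile command below are illustrative, not from a real project.
entry = {
  "directory" => "/home/user/project",
  "command"   => "gcc -Iinclude -DTEST -c src/foo.c -o build/foo.o",
  "file"      => "src/foo.c"
}

# clangd consumes a JSON array of such entries.
puts JSON.pretty_generate([entry])
```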
## Setup
Enable the plugin in your project.yml by adding `compile_commands_json` to the list
of enabled plugins.
``` YAML
:plugins:
:enabled:
- compile_commands_json
```
## Configuration
There is no additional configuration necessary to run this plugin.

View File

@ -0,0 +1,35 @@
require 'ceedling/plugin'
require 'ceedling/constants'
require 'json'
class CompileCommandsJson < Plugin
def setup
@fullpath = File.join(PROJECT_BUILD_ARTIFACTS_ROOT, "compile_commands.json")
@database = if (File.exist?(@fullpath))
JSON.parse( File.read(@fullpath) )
else
[]
end
end
def post_compile_execute(arg_hash)
# Create the new Entry
value = {
"directory" => Dir.pwd,
"command" => arg_hash[:shell_command],
"file" => arg_hash[:source]
}
# Determine if we're updating an existing file description or adding a new one
index = @database.index {|h| h["file"] == arg_hash[:source]}
if index
@database[index] = value
else
@database << value
end
# Update the Actual compile_commands.json file
File.open(@fullpath,'w') {|f| f << JSON.pretty_generate(@database)}
end
end

View File

@ -0,0 +1,254 @@
ceedling-dependencies
=====================
Plugin for supporting release dependencies. It's rare for an embedded project to
be built completely free of other libraries and modules. Some of these may be
standard internal libraries. Some of these may be 3rd party libraries. In either
case, they become part of the project's ecosystem.
This plugin is intended to make that relationship easier. It allows you to specify
a source for dependencies. If required, it will automatically grab the appropriate
version of that dependency.
Most 3rd party libraries have a method of building already in place. While we'd
love to convert the world to a place where everything downloads with a test suite
in Ceedling, that's not likely to happen anytime soon. Until then, this plugin
will allow the developer to specify what calls Ceedling should make to oversee
the build process of those third party utilities. Are they using Make? CMake? A
custom series of scripts that only a mad scientist could possibly understand? No
matter. Ceedling has you covered. Just specify what should be called, and Ceedling
will make it happen whenever it notices that the output artifacts are missing.
Output artifacts? Sure! Things like static and dynamic libraries, or folders
containing header files that might want to be included by your release project.
So how does all this magic work?
First, you need to add the `:dependencies` plugin to your list of enabled plugins. Then, we'll add a new
section called `:dependencies`. There, you can list as many dependencies as you desire. Each
has a series of fields which help Ceedling to understand your needs. Many of them are
optional. If you don't need that feature, just don't include it! In the end, it'll look
something like this:
``` YAML
:dependencies:
:libraries:
- :name: WolfSSL
:source_path: third_party/wolfssl/source
:build_path: third_party/wolfssl/build
:artifact_path: third_party/wolfssl/install
:fetch:
:method: :zip
:source: \\shared_drive\third_party_libs\wolfssl\wolfssl-4.2.0.zip
:environment:
- CFLAGS+=-DWOLFSSL_DTLS_ALLOW_FUTURE
:build:
- "autoreconf -i"
- "./configure --enable-tls13 --enable-singlethreaded"
- make
- make install
:artifacts:
:static_libraries:
- lib/wolfssl.a
:dynamic_libraries:
- lib/wolfssl.so
:includes:
- include/**
```
Let's take a deeper look at each of these features.
The Starting Dash & Name
------------------------
Yes, that opening dash tells the dependencies plugin that the rest of these fields
belong to our first dependency. If we had a second dependency, we'd have another
dash, lined up with the first, and followed by all the fields indented again.
By convention, we use the `:name` field as the first field for each tool. Ceedling
honestly doesn't care which order the fields are given... but as humans, it makes
it easier for us to see the name of each dependency next to its starting dash.
The `:name` field is only used to print progress while Ceedling is running. You may
name each dependency whatever you wish.
Working Folders
---------------
The `:source_path` field allows us to specify where the source code for each of our
dependencies is stored. If fetching the dependency from elsewhere, it will be fetched
to this location. All commands to build this dependency will be executed from
this location (override this by specifying a `:build_path`). Finally, the output
artifacts will be referenced relative to this location (override this by specifying an `:artifact_path`).
If unspecified, the `:source_path` will be `dependencies\dep_name` where `dep_name`
is the name specified in `:name` above (with special characters removed). It's best,
though, if you specify exactly where you want your dependencies to live.
If the dependency is directly included in your project (you've specified `:none` as the
`:method` for fetching), then `:source_path` should be where Ceedling can find the
source for your dependency in your repo.
All artifacts are relative to the `:artifact_path` (which defaults to the same as
`:source_path`).
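The fallback rules above can be sketched in Ruby. The helper name `default_paths` is illustrative, not part of the plugin's API:

```ruby
# Sketch of the defaulting described above: :build_path and
# :artifact_path fall back to :source_path, which in turn falls back
# to dependencies/<name> with special characters stripped.
def default_paths(dep)
  name   = dep[:name].gsub(/\W/, '')
  source = dep[:source_path] || File.join('dependencies', name)
  {
    source:   source,
    build:    dep[:build_path]    || source,
    artifact: dep[:artifact_path] || source
  }
end

paths = default_paths(name: 'Wolf SSL!', artifact_path: 'third_party/install')
```

Here `'Wolf SSL!'` collapses to `dependencies/WolfSSL`, while the explicitly given `:artifact_path` wins over the default.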
Fetching Dependencies
---------------------
The `:dependencies` plugin supports the ability to automatically fetch your dependencies
for you... using some common methods of fetching source. This section contains only a
couple of fields:
- `:method` -- This is the method by which this dependency is fetched.
    - `:none` -- This tells Ceedling that the code is already included in the project.
    - `:zip` -- This tells Ceedling that we want to unpack a zip file to our source path.
    - `:git` -- This tells Ceedling that we want to clone a git repo to our source path.
    - `:svn` -- This tells Ceedling that we want to check out a subversion repo to our source path.
    - `:custom` -- This tells Ceedling that we want to use a custom command or commands to fetch the code.
- `:source` -- This is the path or URL from which to fetch the code (zip, git, and svn methods).
- `:tag`/`:branch` -- This is the specific tag or branch that you wish to retrieve (git only; optional).
- `:hash` -- This is the specific SHA-1 hash you want to fetch (git only; optional; requires a deep clone).
- `:revision` -- This is the specific revision you want to fetch (svn only; optional).
- `:executable` -- This is a list of commands to execute when using the `:custom` method.
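As a rough sketch of how those fields combine, the mapping below turns a `:fetch` description into the kind of commands described above. The exact commands the plugin issues may differ, and `fetch_commands` is an illustrative name:

```ruby
# Sketch: translate a :fetch blob into shell commands. A :hash needs a
# deep clone (no --depth 1) so that the commit is actually present.
def fetch_commands(fetch)
  case fetch[:method]
  when :none   then []
  when :zip    then ["unzip -o #{fetch[:source]}"]
  when :git
    branch = fetch[:tag] || fetch[:branch]
    opt = branch ? "-b #{branch} " : ""
    if fetch[:hash]
      ["git clone #{opt}#{fetch[:source]} .", "git checkout #{fetch[:hash]}"]
    else
      ["git clone #{opt}--depth 1 #{fetch[:source]} ."]
    end
  when :svn    then ["svn checkout --revision #{fetch[:revision] || 'HEAD'} #{fetch[:source]} ."]
  when :custom then fetch[:executable]
  end
end
```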
Environment Variables
---------------------
Many build systems support customization through environment variables. By specifying
an array of environment variables, Ceedling will customize the shell environment before
calling the build process.
Environment variables may be specified in three ways. Let's look at one of each:
``` YAML
:environment:
- ARCHITECTURE=ARM9
- CFLAGS+=-DADD_AWESOMENESS
- CFLAGS-=-DWASTE
```
In the first example, you see the most straightforward method. The environment variable
`ARCHITECTURE` is set to the value `ARM9`. That's it. Simple.
The next two options modify an existing symbol. In the first one, we use `+=`, which tells
Ceedling to add the define `ADD_AWESOMENESS` to the environment variable `CFLAGS`. The second
tells Ceedling to remove the define `WASTE` from the same environment variable.
There are a couple of things to note here.
First, when adding to a variable, Ceedling has no way of knowing
what delimiter you are expecting. In this example you can see we manually added some whitespace.
If we had been modifying `PATH` instead, we might have had to use a `:` on unix or a `;` on
Windows.
Second, removing an argument will have no effect unless that argument is found
precisely. It's case sensitive and the entire string must match. If the variable doesn't already exist,
it WILL after executing this command... however, it will be assigned an empty value.
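The three forms can be sketched as follows, using a plain hash in place of the real process environment (`apply_env` is an illustrative name, not the plugin's API):

```ruby
# Sketch: VAR=v sets, VAR+=v appends verbatim (note: no delimiter is
# inserted for you), VAR-=v removes an exact, case-sensitive match.
def apply_env(env, spec)
  name, op, value = spec.match(/^(\w+)(\+=|-=|=)(.*)$/).captures
  case op
  when '+=' then env[name] = (env[name] || '') + value
  when '-=' then env[name] = (env[name] || '').gsub(value, '')
  else           env[name] = value
  end
  env
end

env = {}
apply_env(env, 'ARCHITECTURE=ARM9')
apply_env(env, 'CFLAGS+=-DADD_AWESOMENESS')
apply_env(env, 'CFLAGS+= -DWASTE')   # the leading space is the manual delimiter
apply_env(env, 'CFLAGS-= -DWASTE')   # removed only because it matches exactly
```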
Building Dependencies
---------------------
The heart of the `:dependencies` plugin is the ability for you, the developer, to specify the
build process for each of your dependencies. You will need to have any required tools installed
before using this feature.
The steps are specified as an array of strings. Ceedling will execute those steps in the order
specified, moving from step to step unless an error is encountered. By the end of the process,
the artifacts should have been created by your process... otherwise an error will be produced.
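That sequencing can be sketched like this; `system` stands in for the plugin's real tool executor, and the function name is illustrative:

```ruby
# Sketch: run each build step in order, stop at the first failure, and
# finally confirm that the promised artifacts actually exist.
def run_build(steps, artifacts)
  steps.each do |step|
    system(step) or raise "Build step failed: #{step}"
  end
  missing = artifacts.reject { |path| File.exist?(path) }
  raise "Artifacts not produced: #{missing.join(', ')}" unless missing.empty?
end

run_build(['true'], [])   # trivially succeeds: exit 0, nothing required
```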
Artifacts
---------
These are the outputs of the build process. There are several types of artifacts, and any dependency
may have none or some of these. Calling out these files tells Ceedling that they are important.
Your dependency's build process may produce many other files... but these are the files that
Ceedling understands it needs to act on.
### `static_libraries`
Specifying one or more static libraries will tell Ceedling where it should find static libraries
output by your build process. These libraries are automatically added to the list of dependencies
and will be linked with the rest of your code to produce the final release.
If any of these libraries don't exist, Ceedling will trigger your build process in order for it
to produce them.
### `dynamic_libraries`
Specifying one or more dynamic libraries will tell Ceedling where it should find dynamic libraries
output by your build process. These libraries are automatically copied to the same folder as your
final release binary.
If any of these libraries don't exist, Ceedling will trigger your build process in order for it
to produce them.
### `includes`
Often when libraries are built, the same process will output a collection of includes so that
your release code knows how to interact with that library. It's the public API for that library.
By specifying the directories that will contain these includes (don't specify the files themselves;
Ceedling only needs the directories), Ceedling is able to automatically add these to its internal
include list. This allows these files to be used while building your release code, as well as making
them mockable during unit testing.
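A sketch of that collection step, using a throwaway directory (`demo_install` is purely illustrative):

```ruby
require 'fileutils'

# Sketch: given the include directories a dependency declares, collect
# the header files inside them, as described above.
FileUtils.mkdir_p('demo_install/include')
File.write('demo_install/include/wolfssl_api.h', '/* public API */')

include_dirs = ['demo_install/include']
headers = include_dirs.flat_map { |dir| Dir[File.join(dir, '*.h')] }
```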
### `source`
It's possible that your external dependency will just produce additional C files as its output.
In this case, Ceedling is able to automatically add these to its internal source list. This allows
these files to be used while building your release code.
Tasks
-----
Once configured correctly, the `:dependencies` plugin should integrate seamlessly into your
workflow and you shouldn't have to think about it. In the real world, that doesn't always happen.
Here are a number of tasks that are added or modified by this plugin.
### `ceedling dependencies:clean`
This can be issued in order to completely remove the dependency from its source path. On the
next build, it will be refetched and rebuilt from scratch. This can also apply to a particular
dependency. For example, by specifying `dependencies:clean:DepName`.
### `ceedling dependencies:fetch`
This can be issued in order to fetch each dependency from its origin. This will have no effect on
dependencies that don't have fetch instructions specified. This can also apply to a particular
dependency. For example, by specifying `dependencies:fetch:DepName`.
### `ceedling dependencies:make`
This will force the dependencies to all build. This should happen automatically when a release
has been triggered... but if you're just getting your dependency configured at this moment, you
may want to just use this feature instead. A single dependency can also be built by specifying its
name, like `dependencies:make:MyTunaBoat`.
### `ceedling dependencies:deploy`
This will force any dynamic libraries produced by your dependencies to be copied to your release
build directory... just in case you clobbered them.
### `paths:include`
Maybe you want to verify that all the include paths are correct. If you query Ceedling with this
request, it will list all the header file paths that it's found, including those produced by
dependencies.
### `files:include`
Maybe you want to take that query further and actually get a list of ALL the header files
Ceedling has found, including those belonging to your dependencies.
Testing
=======
Hopefully all your dependencies are fully tested... but we can't always depend on that.
In the event that they are tested with Ceedling, you'll probably want to consider using
the `:subprojects` plugin instead of this one. The purpose of this plugin is to pull in
third party code for release... and to provide a mockable interface for Ceedling to use
during its tests of other modules.
If that's what you're after... you've found the right plugin!
Happy Testing!

View File

@ -0,0 +1,5 @@
---
:dependencies:
:libraries: []
...

View File

@ -0,0 +1,147 @@
DEPENDENCIES_LIBRARIES.each do |deplib|
# Look up the name of this dependency library
deplib_name = @ceedling[DEPENDENCIES_SYM].get_name(deplib)
# Make sure the required working directories exists
# (don't worry about the subdirectories. That's the job of the dep's build tool)
paths = @ceedling[DEPENDENCIES_SYM].get_working_paths(deplib)
paths.each {|path| directory(path) }
task :directories => paths
all_deps = @ceedling[DEPENDENCIES_SYM].get_static_libraries_for_dependency(deplib) +
@ceedling[DEPENDENCIES_SYM].get_dynamic_libraries_for_dependency(deplib) +
@ceedling[DEPENDENCIES_SYM].get_include_directories_for_dependency(deplib) +
@ceedling[DEPENDENCIES_SYM].get_source_files_for_dependency(deplib)
# Add a rule for building the actual libraries from dependency list
(@ceedling[DEPENDENCIES_SYM].get_static_libraries_for_dependency(deplib) +
@ceedling[DEPENDENCIES_SYM].get_dynamic_libraries_for_dependency(deplib)
).each do |libpath|
file libpath do |filetask|
path = filetask.name
# We double-check that it doesn't already exist, because this process sometimes
# produces multiple files, but they may have already been flagged as invoked
unless (File.exist?(path))
# Set Environment Variables, Fetch, and Build
@ceedling[DEPENDENCIES_SYM].set_env_if_required(path)
@ceedling[DEPENDENCIES_SYM].fetch_if_required(path)
@ceedling[DEPENDENCIES_SYM].build_if_required(path)
end
end
end
# Add a rule for building the source and includes from dependency list
(@ceedling[DEPENDENCIES_SYM].get_include_directories_for_dependency(deplib) +
@ceedling[DEPENDENCIES_SYM].get_source_files_for_dependency(deplib)
).each do |libpath|
task libpath do |filetask|
path = filetask.name
unless (File.file?(path) || File.directory?(path))
# Set Environment Variables, Fetch, and Build
@ceedling[DEPENDENCIES_SYM].set_env_if_required(path)
@ceedling[DEPENDENCIES_SYM].fetch_if_required(path)
@ceedling[DEPENDENCIES_SYM].build_if_required(path)
end
end
end
# Give ourselves a way to trigger individual dependencies
namespace DEPENDENCIES_SYM do
namespace :deploy do
# Add task to directly just build this dependency
task(deplib_name => @ceedling[DEPENDENCIES_SYM].get_dynamic_libraries_for_dependency(deplib)) do |t,args|
@ceedling[DEPENDENCIES_SYM].deploy_if_required(deplib_name)
end
end
namespace :make do
# Add task to directly just build this dependency
task(deplib_name => all_deps)
end
namespace :clean do
# Add task to directly clobber this dependency
task(deplib_name) do
@ceedling[DEPENDENCIES_SYM].clean_if_required(deplib_name)
end
end
namespace :fetch do
# Add task to directly fetch this dependency
task(deplib_name) do
@ceedling[DEPENDENCIES_SYM].fetch_if_required(deplib_name)
end
end
end
# Add source files to our list of things to build during release
source_files = @ceedling[DEPENDENCIES_SYM].get_source_files_for_dependency(deplib)
task PROJECT_RELEASE_BUILD_TARGET => source_files
# Finally, add the static libraries to our RELEASE build dependency list
static_libs = @ceedling[DEPENDENCIES_SYM].get_static_libraries_for_dependency(deplib)
task RELEASE_SYM => static_libs
# Add the dynamic libraries to our RELEASE task dependency list so that they will be copied automatically
dynamic_libs = @ceedling[DEPENDENCIES_SYM].get_dynamic_libraries_for_dependency(deplib)
task RELEASE_SYM => dynamic_libs
# Add the include dirs / files to our list of dependencies for release
headers = @ceedling[DEPENDENCIES_SYM].get_include_directories_for_dependency(deplib)
task RELEASE_SYM => headers
# Paths to Libraries need to be Added to the Lib Path List
all_libs = static_libs + dynamic_libs
PATHS_LIBRARIES ||= []
all_libs.each {|lib| PATHS_LIBRARIES << File.dirname(lib) }
PATHS_LIBRARIES.uniq!
PATHS_LIBRARIES.reject!{|s| s.empty?}
# Libraries Need to be Added to the Library List
LIBRARIES_SYSTEM ||= []
all_libs.each {|lib| LIBRARIES_SYSTEM << File.basename(lib,'.*').sub(/^lib/,'') }
LIBRARIES_SYSTEM.uniq!
LIBRARIES_SYSTEM.reject!{|s| s.empty?}
end
# Add any artifact:include or :source folders to our release & test includes paths so linking and mocking work.
@ceedling[DEPENDENCIES_SYM].add_headers_and_sources()
# Add tasks for building or cleaning ALL dependencies
namespace DEPENDENCIES_SYM do
desc "Deploy missing dependencies."
task :deploy => DEPENDENCIES_LIBRARIES.map{|deplib| "#{DEPENDENCIES_SYM}:deploy:#{@ceedling[DEPENDENCIES_SYM].get_name(deplib)}"}
desc "Build any missing dependencies."
task :make => DEPENDENCIES_LIBRARIES.map{|deplib| "#{DEPENDENCIES_SYM}:make:#{@ceedling[DEPENDENCIES_SYM].get_name(deplib)}"}
desc "Clean all dependencies."
task :clean => DEPENDENCIES_LIBRARIES.map{|deplib| "#{DEPENDENCIES_SYM}:clean:#{@ceedling[DEPENDENCIES_SYM].get_name(deplib)}"}
desc "Fetch all dependencies."
task :fetch => DEPENDENCIES_LIBRARIES.map{|deplib| "#{DEPENDENCIES_SYM}:fetch:#{@ceedling[DEPENDENCIES_SYM].get_name(deplib)}"}
end
namespace :files do
desc "List all collected dependency libraries."
task :dependencies do
puts "dependency files:"
deps = []
DEPENDENCIES_LIBRARIES.each do |deplib|
deps << @ceedling[DEPENDENCIES_SYM].get_static_libraries_for_dependency(deplib)
deps << @ceedling[DEPENDENCIES_SYM].get_dynamic_libraries_for_dependency(deplib)
end
deps.flatten!
deps.sort.each {|dep| puts " - #{dep}"}
puts "file count: #{deps.size}"
end
end
# Make sure that we build dependencies before attempting to tackle any of the unit tests
Rake::Task[:test_deps].enhance ['dependencies:make']

View File

@ -0,0 +1,237 @@
require 'ceedling/plugin'
require 'ceedling/constants'
DEPENDENCIES_ROOT_NAME = 'dependencies'
DEPENDENCIES_TASK_ROOT = DEPENDENCIES_ROOT_NAME + ':'
DEPENDENCIES_SYM = DEPENDENCIES_ROOT_NAME.to_sym
class Dependencies < Plugin
def setup
@plugin_root = File.expand_path(File.join(File.dirname(__FILE__), '..'))
# Set up a fast way to look up dependencies by name or static lib path
@dependencies = {}
@dynamic_libraries = []
DEPENDENCIES_LIBRARIES.each do |deplib|
@dependencies[ deplib[:name] ] = deplib.clone
all_deps = get_static_libraries_for_dependency(deplib) +
get_dynamic_libraries_for_dependency(deplib) +
get_include_directories_for_dependency(deplib) +
get_source_files_for_dependency(deplib)
all_deps.each do |key|
@dependencies[key] = @dependencies[ deplib[:name] ]
end
@dynamic_libraries += get_dynamic_libraries_for_dependency(deplib)
end
end
def config
updates = {
:collection_paths_include => COLLECTION_PATHS_INCLUDE,
:collection_all_headers => COLLECTION_ALL_HEADERS,
}
DEPENDENCIES_LIBRARIES.each do |deplib|
get_include_directories_for_dependency(deplib).each do |incpath|
updates[:collection_paths_include] << incpath
Dir[ File.join(incpath, "*#{EXTENSION_HEADER}") ].each do |f|
updates[:collection_all_headers] << f
end
end
end
return updates
end
def get_name(deplib)
raise "Each dependency must have a name!" if deplib[:name].nil?
return deplib[:name].gsub(/\W*/,'')
end
def get_source_path(deplib)
return deplib[:source_path] || File.join('dependencies', get_name(deplib))
end
def get_build_path(deplib)
return deplib[:build_path] || deplib[:source_path] || File.join('dependencies', get_name(deplib))
end
def get_artifact_path(deplib)
return deplib[:artifact_path] || deplib[:source_path] || File.join('dependencies', get_name(deplib))
end
def get_working_paths(deplib)
paths = [deplib[:source_path], deplib[:build_path], deplib[:artifact_path]].compact.uniq
paths = [ File.join('dependencies', get_name(deplib)) ] if (paths.empty?)
return paths
end
def get_static_libraries_for_dependency(deplib)
(deplib[:artifacts][:static_libraries] || []).map {|path| File.join(get_artifact_path(deplib), path)}
end
def get_dynamic_libraries_for_dependency(deplib)
(deplib[:artifacts][:dynamic_libraries] || []).map {|path| File.join(get_artifact_path(deplib), path)}
end
def get_source_files_for_dependency(deplib)
(deplib[:artifacts][:source] || []).map {|path| File.join(get_artifact_path(deplib), path)}
end
def get_include_directories_for_dependency(deplib)
paths = (deplib[:artifacts][:includes] || []).map {|path| File.join(get_artifact_path(deplib), path)}
@ceedling[:file_system_utils].collect_paths(paths)
end
def set_env_if_required(lib_path)
blob = @dependencies[lib_path]
raise "Could not find dependency '#{lib_path}'" if blob.nil?
return if (blob[:environment].nil?)
return if (blob[:environment].empty?)
blob[:environment].each do |e|
m = e.match(/^(\w+)\s*(\+?\-?=)\s*(.*)$/)
unless m.nil?
case m[2]
when "+="
ENV[m[1]] = (ENV[m[1]] || "") + m[3]
when "-="
ENV[m[1]] = (ENV[m[1]] || "").gsub(m[3],'')
else
ENV[m[1]] = m[3]
end
end
end
end
def fetch_if_required(lib_path)
blob = @dependencies[lib_path]
raise "Could not find dependency '#{lib_path}'" if blob.nil?
return if (blob[:fetch].nil?)
return if (blob[:fetch][:method].nil?)
return if (File.directory?(get_source_path(blob)) && !Dir.empty?(get_source_path(blob)))
steps = case blob[:fetch][:method]
when :none
return
when :zip
[ "unzip -o #{blob[:fetch][:source]}" ]
when :git
branch = blob[:fetch][:tag] || blob[:fetch][:branch] || ''
branch = ("-b " + branch) unless branch.empty?
unless blob[:fetch][:hash].nil?
# Do a deep clone to ensure the commit we want is available
retval = [ "git clone #{branch} #{blob[:fetch][:source]} ." ]
# Checkout the specified commit
retval << "git checkout #{blob[:fetch][:hash]}"
else
# Do a thin clone
retval = [ "git clone #{branch} --depth 1 #{blob[:fetch][:source]} ." ]
end
when :svn
revision = blob[:fetch][:revision] || ''
revision = ("--revision " + revision) unless revision.empty?
retval = [ "svn checkout #{revision} #{blob[:fetch][:source]} ." ]
retval
when :custom
blob[:fetch][:executable]
else
raise "Unknown fetch method '#{blob[:fetch][:method].to_s}' for dependency '#{blob[:name]}'"
end
# Perform the actual fetching
@ceedling[:streaminator].stdout_puts("Fetching dependency #{blob[:name]}...", Verbosity::NORMAL)
Dir.chdir(get_source_path(blob)) do
steps.each do |step|
@ceedling[:tool_executor].exec( step )
end
end
end
def build_if_required(lib_path)
blob = @dependencies[lib_path]
raise "Could not find dependency '#{lib_path}'" if blob.nil?
# We don't clean anything unless we know how to fetch a new copy
if (blob[:build].nil? || blob[:build].empty?)
@ceedling[:streaminator].stdout_puts("Nothing to build for dependency #{blob[:name]}", Verbosity::NORMAL)
return
end
# Perform the build
@ceedling[:streaminator].stdout_puts("Building dependency #{blob[:name]}...", Verbosity::NORMAL)
Dir.chdir(get_build_path(blob)) do
blob[:build].each do |step|
@ceedling[:tool_executor].exec( step )
end
end
end
def clean_if_required(lib_path)
blob = @dependencies[lib_path]
raise "Could not find dependency '#{lib_path}'" if blob.nil?
# We don't clean anything unless we know how to fetch a new copy
if (blob[:fetch].nil? || blob[:fetch][:method].nil? || (blob[:fetch][:method] == :none))
@ceedling[:streaminator].stdout_puts("Nothing to clean for dependency #{blob[:name]}", Verbosity::NORMAL)
return
end
# Perform the actual Cleaning
@ceedling[:streaminator].stdout_puts("Cleaning dependency #{blob[:name]}...", Verbosity::NORMAL)
get_working_paths(blob).each do |path|
FileUtils.rm_rf(path) if File.directory?(path)
end
end
def deploy_if_required(lib_path)
blob = @dependencies[lib_path]
raise "Could not find dependency '#{lib_path}'" if blob.nil?
# We don't need to deploy anything if there isn't anything to deploy
if (blob[:artifacts].nil? || blob[:artifacts][:dynamic_libraries].nil? || blob[:artifacts][:dynamic_libraries].empty?)
@ceedling[:streaminator].stdout_puts("Nothing to deploy for dependency #{blob[:name]}", Verbosity::NORMAL)
return
end
# Perform the actual Deploying
@ceedling[:streaminator].stdout_puts("Deploying dependency #{blob[:name]}...", Verbosity::NORMAL)
get_dynamic_libraries_for_dependency(blob).each do |lib|
FileUtils.cp( lib, File.dirname(PROJECT_RELEASE_BUILD_TARGET) )
end
end
def add_headers_and_sources()
# Search for header file paths and files to add to our collections
DEPENDENCIES_LIBRARIES.each do |deplib|
get_include_directories_for_dependency(deplib).each do |header|
cfg = @ceedling[:configurator].project_config_hash
cfg[:collection_paths_include] << header
cfg[:collection_paths_source_and_include] << header
cfg[:collection_paths_test_support_source_include] << header
cfg[:collection_paths_test_support_source_include_vendor] << header
cfg[:collection_paths_release_toolchain_include] << header
Dir[ File.join(header, "*#{EXTENSION_HEADER}") ].each do |f|
cfg[:collection_all_headers] << f
end
end
get_source_files_for_dependency(deplib).each do |source|
cfg = @ceedling[:configurator].project_config_hash
cfg[:collection_paths_source_and_include] << source
cfg[:collection_paths_test_support_source_include] << source
cfg[:collection_paths_test_support_source_include_vendor] << source
cfg[:collection_paths_release_toolchain_include] << source
Dir[ File.join(source, "*#{EXTENSION_SOURCE}") ].each do |f|
cfg[:collection_all_source] << f
end
end
end
# Make all these updated files findable by Ceedling
@ceedling[:file_finder].prepare_search_sources()
end
end
# end blocks always executed following rake run
END {
}

View File

@ -0,0 +1,118 @@
DEFAULT_GCOV_COMPILER_TOOL = {
:executable => ENV['CC'].nil? ? FilePathUtils.os_executable_ext('gcc').freeze : ENV['CC'].split[0],
:name => 'default_gcov_compiler'.freeze,
:stderr_redirect => StdErrRedirect::NONE.freeze,
:background_exec => BackgroundExec::NONE.freeze,
:optional => false.freeze,
:arguments => [
"-g".freeze,
"-fprofile-arcs".freeze,
"-ftest-coverage".freeze,
ENV['CC'].nil? ? "" : ENV['CC'].split[1..-1],
ENV['CPPFLAGS'].nil? ? "" : ENV['CPPFLAGS'].split,
{"-I\"$\"" => 'COLLECTION_PATHS_TEST_SUPPORT_SOURCE_INCLUDE_VENDOR'}.freeze,
{"-I\"$\"" => 'COLLECTION_PATHS_TEST_TOOLCHAIN_INCLUDE'}.freeze,
{"-D$" => 'COLLECTION_DEFINES_TEST_AND_VENDOR'}.freeze,
"-DGCOV_COMPILER".freeze,
"-DCODE_COVERAGE".freeze,
ENV['CFLAGS'].nil? ? "" : ENV['CFLAGS'].split,
"-c \"${1}\"".freeze,
"-o \"${2}\"".freeze
].freeze
}
DEFAULT_GCOV_LINKER_TOOL = {
:executable => ENV['CCLD'].nil? ? FilePathUtils.os_executable_ext('gcc').freeze : ENV['CCLD'].split[0],
:name => 'default_gcov_linker'.freeze,
:stderr_redirect => StdErrRedirect::NONE.freeze,
:background_exec => BackgroundExec::NONE.freeze,
:optional => false.freeze,
:arguments => [
"-g".freeze,
"-fprofile-arcs".freeze,
"-ftest-coverage".freeze,
ENV['CCLD'].nil? ? "" : ENV['CCLD'].split[1..-1],
ENV['CFLAGS'].nil? ? "" : ENV['CFLAGS'].split,
ENV['LDFLAGS'].nil? ? "" : ENV['LDFLAGS'].split,
"\"${1}\"".freeze,
"-o \"${2}\"".freeze,
"${4}".freeze,
"${5}".freeze,
ENV['LDLIBS'].nil? ? "" : ENV['LDLIBS'].split
].freeze
}
DEFAULT_GCOV_FIXTURE_TOOL = {
:executable => '${1}'.freeze,
:name => 'default_gcov_fixture'.freeze,
:stderr_redirect => StdErrRedirect::NONE.freeze,
:background_exec => BackgroundExec::NONE.freeze,
:optional => false.freeze,
:arguments => [].freeze
}
DEFAULT_GCOV_REPORT_TOOL = {
:executable => ENV['GCOV'].nil? ? FilePathUtils.os_executable_ext('gcov').freeze : ENV['GCOV'].split[0],
:name => 'default_gcov_report'.freeze,
:stderr_redirect => StdErrRedirect::NONE.freeze,
:background_exec => BackgroundExec::NONE.freeze,
:optional => false.freeze,
:arguments => [
"-n".freeze,
"-p".freeze,
"-b".freeze,
{"-o \"$\"" => 'GCOV_BUILD_OUTPUT_PATH'}.freeze,
"\"${1}\"".freeze
].freeze
}
DEFAULT_GCOV_GCOV_POST_REPORT_TOOL = {
:executable => ENV['GCOV'].nil? ? FilePathUtils.os_executable_ext('gcov').freeze : ENV['GCOV'].split[0],
:name => 'default_gcov_gcov_post_report'.freeze,
:stderr_redirect => StdErrRedirect::NONE.freeze,
:background_exec => BackgroundExec::NONE.freeze,
:optional => true.freeze,
:arguments => [
"-b".freeze,
"-c".freeze,
"-r".freeze,
"-x".freeze,
"${1}".freeze
].freeze
}
DEFAULT_GCOV_GCOVR_POST_REPORT_TOOL = {
:executable => 'gcovr'.freeze,
:name => 'default_gcov_gcovr_post_report'.freeze,
:stderr_redirect => StdErrRedirect::NONE.freeze,
:background_exec => BackgroundExec::NONE.freeze,
:optional => true.freeze,
:arguments => [
"${1}".freeze
].freeze
}
DEFAULT_GCOV_REPORTGENERATOR_POST_REPORT = {
:executable => 'reportgenerator'.freeze,
:name => 'default_gcov_reportgenerator_post_report'.freeze,
:stderr_redirect => StdErrRedirect::NONE.freeze,
:background_exec => BackgroundExec::NONE.freeze,
:optional => true.freeze,
:arguments => [
"${1}".freeze
].freeze
}
def get_default_config
return :tools => {
:gcov_compiler => DEFAULT_GCOV_COMPILER_TOOL,
:gcov_linker => DEFAULT_GCOV_LINKER_TOOL,
:gcov_fixture => DEFAULT_GCOV_FIXTURE_TOOL,
:gcov_report => DEFAULT_GCOV_REPORT_TOOL,
:gcov_gcov_post_report => DEFAULT_GCOV_GCOV_POST_REPORT_TOOL,
:gcov_gcovr_post_report => DEFAULT_GCOV_GCOVR_POST_REPORT_TOOL,
:gcov_reportgenerator_post_report => DEFAULT_GCOV_REPORTGENERATOR_POST_REPORT
}
end

View File

@ -0,0 +1,331 @@
require 'reportinator_helper'
class GcovrReportinator
def initialize(system_objects)
@ceedling = system_objects
@reportinator_helper = ReportinatorHelper.new
end
# Generate the gcovr report(s) specified in the options.
def make_reports(opts)
# Get the gcovr version number.
gcovr_version_info = get_gcovr_version()
# Build the common gcovr arguments.
args_common = args_builder_common(opts)
if ((gcovr_version_info[0] == 4) && (gcovr_version_info[1] >= 2)) || (gcovr_version_info[0] > 4)
# gcovr version 4.2 and later supports generating multiple reports with a single call.
args = args_common
args += args_builder_cobertura(opts, false)
args += args_builder_sonarqube(opts, false)
args += args_builder_json(opts, true)
# As of gcovr version 4.2, the --html argument must appear last.
args += args_builder_html(opts, false)
print "Creating gcov results report(s) in '#{GCOV_ARTIFACTS_PATH}'... "
STDOUT.flush
# Generate the report(s).
run(args)
else
# gcovr version 4.1 and earlier supports HTML and Cobertura XML reports.
# It does not support SonarQube and JSON reports.
# Reports must also be generated separately.
args_cobertura = args_builder_cobertura(opts, true)
args_html = args_builder_html(opts, true)
if args_html.length > 0
print "Creating a gcov HTML report in '#{GCOV_ARTIFACTS_PATH}'... "
STDOUT.flush
# Generate the HTML report.
run(args_common + args_html)
end
if args_cobertura.length > 0
print "Creating a gcov XML report in '#{GCOV_ARTIFACTS_PATH}'... "
STDOUT.flush
# Generate the Cobertura XML report.
run(args_common + args_cobertura)
end
end
# Determine if the gcovr text report is enabled. Defaults to disabled.
if is_report_enabled(opts, ReportTypes::TEXT)
make_text_report(opts, args_common)
end
end
def support_deprecated_options(opts)
# Support deprecated :html_report: and ":html_report_type: basic" options.
if !is_report_enabled(opts, ReportTypes::HTML_BASIC) && (opts[:gcov_html_report] || (opts[:gcov_html_report_type].is_a? String) && (opts[:gcov_html_report_type].casecmp("basic") == 0))
opts[:gcov_reports].push(ReportTypes::HTML_BASIC)
end
# Support deprecated ":html_report_type: detailed" option.
if !is_report_enabled(opts, ReportTypes::HTML_DETAILED) && (opts[:gcov_html_report_type].is_a? String) && (opts[:gcov_html_report_type].casecmp("detailed") == 0)
opts[:gcov_reports].push(ReportTypes::HTML_DETAILED)
end
# Support deprecated :xml_report: option.
if opts[:gcov_xml_report]
opts[:gcov_reports].push(ReportTypes::COBERTURA)
end
# Default to HTML basic report when no report types are defined.
if opts[:gcov_reports].empty? && opts[:gcov_html_report_type].nil? && opts[:gcov_xml_report].nil?
opts[:gcov_reports] = [ReportTypes::HTML_BASIC]
puts "In your project.yml, define one or more of the"
puts "following to specify which reports to generate."
puts "For now, creating only an #{ReportTypes::HTML_BASIC} report."
puts ""
puts ":gcov:"
puts " :reports:"
puts " - #{ReportTypes::HTML_BASIC}"
puts " - #{ReportTypes::HTML_DETAILED}"
puts " - #{ReportTypes::TEXT}"
puts " - #{ReportTypes::COBERTURA}"
puts " - #{ReportTypes::SONARQUBE}"
puts " - #{ReportTypes::JSON}"
puts ""
end
end
private
GCOVR_SETTING_PREFIX = "gcov_gcovr"
# Build the gcovr report generation common arguments.
def args_builder_common(opts)
gcovr_opts = get_opts(opts)
args = ""
args += "--root \"#{gcovr_opts[:report_root] || '.'}\" "
args += "--config \"#{gcovr_opts[:config_file]}\" " unless gcovr_opts[:config_file].nil?
args += "--filter \"#{gcovr_opts[:report_include]}\" " unless gcovr_opts[:report_include].nil?
args += "--exclude \"#{gcovr_opts[:report_exclude] || GCOV_FILTER_EXCLUDE}\" "
args += "--gcov-filter \"#{gcovr_opts[:gcov_filter]}\" " unless gcovr_opts[:gcov_filter].nil?
args += "--gcov-exclude \"#{gcovr_opts[:gcov_exclude]}\" " unless gcovr_opts[:gcov_exclude].nil?
args += "--exclude-directories \"#{gcovr_opts[:exclude_directories]}\" " unless gcovr_opts[:exclude_directories].nil?
args += "--branches " if gcovr_opts[:branches].nil? || gcovr_opts[:branches] # Defaults to enabled.
args += "--sort-uncovered " if gcovr_opts[:sort_uncovered]
args += "--sort-percentage " if gcovr_opts[:sort_percentage].nil? || gcovr_opts[:sort_percentage] # Defaults to enabled.
args += "--print-summary " if gcovr_opts[:print_summary]
args += "--gcov-executable \"#{gcovr_opts[:gcov_executable]}\" " unless gcovr_opts[:gcov_executable].nil?
args += "--exclude-unreachable-branches " if gcovr_opts[:exclude_unreachable_branches]
args += "--exclude-throw-branches " if gcovr_opts[:exclude_throw_branches]
args += "--use-gcov-files " if gcovr_opts[:use_gcov_files]
args += "--gcov-ignore-parse-errors " if gcovr_opts[:gcov_ignore_parse_errors]
args += "--keep " if gcovr_opts[:keep]
args += "--delete " if gcovr_opts[:delete]
args += "-j #{gcovr_opts[:num_parallel_threads]} " if !(gcovr_opts[:num_parallel_threads].nil?) && (gcovr_opts[:num_parallel_threads].is_a? Integer)
[:fail_under_line, :fail_under_branch, :source_encoding, :object_directory].each do |opt|
unless gcovr_opts[opt].nil?
value = gcovr_opts[opt]
if (opt == :fail_under_line) || (opt == :fail_under_branch)
if not value.is_a? Integer
puts "Option value #{opt} has to be an integer"
value = nil
elsif (value < 0) || (value > 100)
puts "Option value #{opt} has to be a percentage from 0 to 100"
value = nil
end
end
args += "--#{opt.to_s.gsub('_','-')} #{value} " unless value.nil?
end
end
return args
end
# Build the gcovr Cobertura XML report generation arguments.
def args_builder_cobertura(opts, use_output_option=false)
gcovr_opts = get_opts(opts)
args = ""
# Determine if the Cobertura XML report is enabled. Defaults to disabled.
if is_report_enabled(opts, ReportTypes::COBERTURA)
# Determine the Cobertura XML report file name.
artifacts_file_cobertura = GCOV_ARTIFACTS_FILE_COBERTURA
if !(gcovr_opts[:cobertura_artifact_filename].nil?)
artifacts_file_cobertura = File.join(GCOV_ARTIFACTS_PATH, gcovr_opts[:cobertura_artifact_filename])
elsif !(gcovr_opts[:xml_artifact_filename].nil?)
artifacts_file_cobertura = File.join(GCOV_ARTIFACTS_PATH, gcovr_opts[:xml_artifact_filename])
end
args += "--xml-pretty " if gcovr_opts[:xml_pretty] || gcovr_opts[:cobertura_pretty]
args += "--xml #{use_output_option ? "--output " : ""} \"#{artifacts_file_cobertura}\" "
end
return args
end
# Build the gcovr SonarQube report generation arguments.
def args_builder_sonarqube(opts, use_output_option=false)
gcovr_opts = get_opts(opts)
args = ""
# Determine if the gcovr SonarQube XML report is enabled. Defaults to disabled.
if is_report_enabled(opts, ReportTypes::SONARQUBE)
# Determine the SonarQube XML report file name.
artifacts_file_sonarqube = GCOV_ARTIFACTS_FILE_SONARQUBE
if !(gcovr_opts[:sonarqube_artifact_filename].nil?)
artifacts_file_sonarqube = File.join(GCOV_ARTIFACTS_PATH, gcovr_opts[:sonarqube_artifact_filename])
end
args += "--sonarqube #{use_output_option ? "--output " : ""} \"#{artifacts_file_sonarqube}\" "
end
return args
end
# Build the gcovr JSON report generation arguments.
def args_builder_json(opts, use_output_option=false)
gcovr_opts = get_opts(opts)
args = ""
# Determine if the gcovr JSON report is enabled. Defaults to disabled.
if is_report_enabled(opts, ReportTypes::JSON)
# Determine the JSON report file name.
artifacts_file_json = GCOV_ARTIFACTS_FILE_JSON
if !(gcovr_opts[:json_artifact_filename].nil?)
artifacts_file_json = File.join(GCOV_ARTIFACTS_PATH, gcovr_opts[:json_artifact_filename])
end
args += "--json-pretty " if gcovr_opts[:json_pretty]
# Note: In gcovr 4.2, the JSON report is output only when the --output option is specified.
# Hopefully we can remove --output after a future gcovr release.
args += "--json #{use_output_option ? "--output " : ""} \"#{artifacts_file_json}\" "
end
return args
end
# Build the gcovr HTML report generation arguments.
def args_builder_html(opts, use_output_option=false)
gcovr_opts = get_opts(opts)
args = ""
# Determine if the gcovr HTML report is enabled. Defaults to enabled.
html_enabled = (opts[:gcov_html_report].nil? && opts[:gcov_reports].empty?) ||
is_report_enabled(opts, ReportTypes::HTML_BASIC) ||
is_report_enabled(opts, ReportTypes::HTML_DETAILED)
if html_enabled
# Determine the HTML report file name.
artifacts_file_html = GCOV_ARTIFACTS_FILE_HTML
if !(gcovr_opts[:html_artifact_filename].nil?)
artifacts_file_html = File.join(GCOV_ARTIFACTS_PATH, gcovr_opts[:html_artifact_filename])
end
is_html_report_type_detailed = (opts[:gcov_html_report_type].is_a? String) && (opts[:gcov_html_report_type].casecmp("detailed") == 0)
args += "--html-details " if is_html_report_type_detailed || is_report_enabled(opts, ReportTypes::HTML_DETAILED)
args += "--html-title \"#{gcovr_opts[:html_title]}\" " unless gcovr_opts[:html_title].nil?
args += "--html-absolute-paths " if !(gcovr_opts[:html_absolute_paths].nil?) && gcovr_opts[:html_absolute_paths]
args += "--html-encoding \"#{gcovr_opts[:html_encoding]}\" " unless gcovr_opts[:html_encoding].nil?
[:html_medium_threshold, :html_high_threshold].each do |opt|
args += "--#{opt.to_s.gsub('_','-')} #{gcovr_opts[opt]} " unless gcovr_opts[opt].nil?
end
# The following option must be appended last for gcovr version <= 4.2 to properly work.
args += "--html #{use_output_option ? "--output " : ""} \"#{artifacts_file_html}\" "
end
return args
end
# Generate a gcovr text report.
def make_text_report(opts, args_common)
gcovr_opts = get_opts(opts)
args_text = ""
message_text = "Creating a gcov text report"
if !(gcovr_opts[:text_artifact_filename].nil?)
artifacts_file_txt = File.join(GCOV_ARTIFACTS_PATH, gcovr_opts[:text_artifact_filename])
args_text += "--output \"#{artifacts_file_txt}\" "
message_text += " in '#{GCOV_ARTIFACTS_PATH}'... "
else
message_text += "... "
end
print message_text
STDOUT.flush
# Generate the text report.
run(args_common + args_text)
end
# Get the gcovr options from the project options.
def get_opts(opts)
return opts[GCOVR_SETTING_PREFIX.to_sym] || {}
end
# Run gcovr with the given arguments.
def run(args)
begin
command = @ceedling[:tool_executor].build_command_line(TOOLS_GCOV_GCOVR_POST_REPORT, [], args)
shell_result = @ceedling[:tool_executor].exec(command[:line], command[:options])
@reportinator_helper.print_shell_result(shell_result)
rescue
# handle any unforeseen issues with called tool
exitcode = $?.exitstatus
show_gcovr_message(exitcode)
exit(exitcode)
end
end
# Get the gcovr version number as components.
# Returns [major, minor].
def get_gcovr_version()
version_number_major = 0
version_number_minor = 0
command = @ceedling[:tool_executor].build_command_line(TOOLS_GCOV_GCOVR_POST_REPORT, [], "--version")
shell_result = @ceedling[:tool_executor].exec(command[:line], command[:options])
version_number_match_data = shell_result[:output].match(/gcovr ([0-9]+)\.([0-9]+)/)
if !(version_number_match_data.nil?) && !(version_number_match_data[1].nil?) && !(version_number_match_data[2].nil?)
version_number_major = version_number_match_data[1].to_i
version_number_minor = version_number_match_data[2].to_i
end
return version_number_major, version_number_minor
end
# Show a more human-friendly message on gcovr return code
def show_gcovr_message(exitcode)
if ((exitcode & 2) == 2)
puts "The line coverage is less than the minimum"
end
if ((exitcode & 4) == 4)
puts "The branch coverage is less than the minimum"
end
end
# Returns true if the given report type is enabled, otherwise returns false.
def is_report_enabled(opts, report_type)
return !(opts.nil?) && !(opts[:gcov_reports].nil?) && (opts[:gcov_reports].map(&:upcase).include? report_type.upcase)
end
end

View File

@ -0,0 +1,195 @@
require 'benchmark'
require 'reportinator_helper'
class ReportGeneratorReportinator
def initialize(system_objects)
@ceedling = system_objects
@reportinator_helper = ReportinatorHelper.new
end
# Generate the ReportGenerator report(s) specified in the options.
def make_reports(opts)
shell_result = nil
total_time = Benchmark.realtime do
rg_opts = get_opts(opts)
print "Creating gcov results report(s) with ReportGenerator in '#{GCOV_REPORT_GENERATOR_PATH}'... "
STDOUT.flush
# Cleanup any existing .gcov files to avoid reporting old coverage results.
for gcov_file in Dir.glob("*.gcov")
File.delete(gcov_file)
end
# Use a custom gcov executable, if specified.
GCOV_TOOL_CONFIG[:executable] = rg_opts[:gcov_executable] unless rg_opts[:gcov_executable].nil?
# Avoid running gcov on the mock, test, unity, and cexception gcov notes files to save time.
gcno_exclude_str = "#{opts[:cmock_mock_prefix]}.*"
gcno_exclude_str += "|#{opts[:project_test_file_prefix]}.*"
gcno_exclude_str += "|#{VENDORS_FILES.join('|')}"
# Avoid running gcov on custom specified .gcno files.
if !(rg_opts.nil?) && !(rg_opts[:gcov_exclude].nil?) && !(rg_opts[:gcov_exclude].empty?)
for gcno_exclude_expression in rg_opts[:gcov_exclude]
if !(gcno_exclude_expression.nil?) && !(gcno_exclude_expression.empty?)
# We want to filter .gcno files, not .gcov files.
# We will generate .gcov files from .gcno files.
gcno_exclude_expression = gcno_exclude_expression.chomp("\\.gcov")
gcno_exclude_expression = gcno_exclude_expression.chomp(".gcov")
# The .gcno extension will be added later as we create the regex.
gcno_exclude_expression = gcno_exclude_expression.chomp("\\.gcno")
gcno_exclude_expression = gcno_exclude_expression.chomp(".gcno")
# Append the custom expression.
gcno_exclude_str += "|#{gcno_exclude_expression}"
end
end
end
gcno_exclude_regex = /(\/|\\)(#{gcno_exclude_str})\.gcno/
# Generate .gcov files by running gcov on gcov notes files (*.gcno).
for gcno_filepath in Dir.glob(File.join(GCOV_BUILD_PATH, "**", "*.gcno"))
match_data = gcno_filepath.match(gcno_exclude_regex)
if match_data.nil? || (match_data[1].nil? && match_data[2].nil?)
# Ensure there is a matching gcov data file.
if File.file?(gcno_filepath.gsub(".gcno", ".gcda"))
run_gcov("\"#{gcno_filepath}\"")
end
end
end
if Dir.glob("*.gcov").length > 0
# Build the command line arguments.
args = args_builder(opts)
# Generate the report(s).
shell_result = run(args)
else
puts "\nWarning: No matching .gcno coverage files found."
end
# Cleanup .gcov files.
for gcov_file in Dir.glob("*.gcov")
File.delete(gcov_file)
end
end
if shell_result
shell_result[:time] = total_time
@reportinator_helper.print_shell_result(shell_result)
end
end
private
# A dictionary of report types defined in this plugin to ReportGenerator report types.
REPORT_TYPE_TO_REPORT_GENERATOR_REPORT_NAME = {
ReportTypes::HTML_BASIC.upcase => "HtmlSummary",
ReportTypes::HTML_DETAILED.upcase => "Html",
ReportTypes::HTML_CHART.upcase => "HtmlChart",
ReportTypes::HTML_INLINE.upcase => "HtmlInline",
ReportTypes::HTML_INLINE_AZURE.upcase => "HtmlInline_AzurePipelines",
ReportTypes::HTML_INLINE_AZURE_DARK.upcase => "HtmlInline_AzurePipelines_Dark",
ReportTypes::MHTML.upcase => "MHtml",
ReportTypes::TEXT.upcase => "TextSummary",
ReportTypes::COBERTURA.upcase => "Cobertura",
ReportTypes::SONARQUBE.upcase => "SonarQube",
ReportTypes::BADGES.upcase => "Badges",
ReportTypes::CSV_SUMMARY.upcase => "CsvSummary",
ReportTypes::LATEX.upcase => "Latex",
ReportTypes::LATEX_SUMMARY.upcase => "LatexSummary",
ReportTypes::PNG_CHART.upcase => "PngChart",
ReportTypes::TEAM_CITY_SUMMARY.upcase => "TeamCitySummary",
ReportTypes::LCOV.upcase => "lcov",
ReportTypes::XML.upcase => "Xml",
ReportTypes::XML_SUMMARY.upcase => "XmlSummary",
}
REPORT_GENERATOR_SETTING_PREFIX = "gcov_report_generator"
# Deep clone the gcov tool config, so we can modify it locally if specified via options.
GCOV_TOOL_CONFIG = Marshal.load(Marshal.dump(TOOLS_GCOV_GCOV_POST_REPORT))
# Build the ReportGenerator arguments.
def args_builder(opts)
rg_opts = get_opts(opts)
report_type_count = 0
args = ""
args += "\"-reports:*.gcov\" "
args += "\"-targetdir:\"#{GCOV_REPORT_GENERATOR_PATH}\"\" "
# Build the report types argument.
if !(opts.nil?) && !(opts[:gcov_reports].nil?) && !(opts[:gcov_reports].empty?)
args += "\"-reporttypes:"
for report_type in opts[:gcov_reports]
rg_report_type = REPORT_TYPE_TO_REPORT_GENERATOR_REPORT_NAME[report_type.upcase]
if !(rg_report_type.nil?)
args += rg_report_type + ";"
report_type_count = report_type_count + 1
end
end
# Removing trailing ';' after the last report type.
args = args.chomp(";")
# Append a space separator after the report type.
args += "\" "
end
# Build the source directories argument.
args += "\"-sourcedirs:.;"
if !(opts[:collection_paths_source].nil?)
args += opts[:collection_paths_source].join(';')
end
args = args.chomp(";")
args += "\" "
args += "\"-historydir:#{rg_opts[:history_directory]}\" " unless rg_opts[:history_directory].nil?
args += "\"-plugins:#{rg_opts[:plugins]}\" " unless rg_opts[:plugins].nil?
args += "\"-assemblyfilters:#{rg_opts[:assembly_filters]}\" " unless rg_opts[:assembly_filters].nil?
args += "\"-classfilters:#{rg_opts[:class_filters]}\" " unless rg_opts[:class_filters].nil?
file_filters = rg_opts[:file_filters] || @ceedling[:tool_executor_helper].osify_path_separators(GCOV_REPORT_GENERATOR_FILE_FILTERS)
args += "\"-filefilters:#{file_filters}\" "
args += "\"-verbosity:#{rg_opts[:verbosity] || "Warning"}\" "
args += "\"-tag:#{rg_opts[:tag]}\" " unless rg_opts[:tag].nil?
args += "\"settings:createSubdirectoryForAllReportTypes=true\" " unless report_type_count <= 1
args += "\"settings:numberOfReportsParsedInParallel=#{rg_opts[:num_parallel_threads]}\" " unless rg_opts[:num_parallel_threads].nil?
args += "\"settings:numberOfReportsMergedInParallel=#{rg_opts[:num_parallel_threads]}\" " unless rg_opts[:num_parallel_threads].nil?
# Append custom arguments.
if !(rg_opts[:custom_args].nil?) && !(rg_opts[:custom_args].empty?)
for custom_arg in rg_opts[:custom_args]
args += "\"#{custom_arg}\" " unless custom_arg.nil? || custom_arg.empty?
end
end
return args
end
# Get the ReportGenerator options from the project options.
def get_opts(opts)
return opts[REPORT_GENERATOR_SETTING_PREFIX.to_sym] || {}
end
# Run ReportGenerator with the given arguments.
def run(args)
command = @ceedling[:tool_executor].build_command_line(TOOLS_GCOV_REPORTGENERATOR_POST_REPORT, [], args)
return @ceedling[:tool_executor].exec(command[:line], command[:options])
end
# Run gcov with the given arguments.
def run_gcov(args)
command = @ceedling[:tool_executor].build_command_line(GCOV_TOOL_CONFIG, [], args)
return @ceedling[:tool_executor].exec(command[:line], command[:options])
end
end

View File

@ -0,0 +1,15 @@
class ReportinatorHelper
# Output the shell result to the console.
def print_shell_result(shell_result)
if !(shell_result.nil?)
puts "Done in %.3f seconds." % shell_result[:time]
if !(shell_result[:output].nil?) && (shell_result[:output].length > 0)
puts shell_result[:output]
end
end
end
end

View File

@ -0,0 +1,36 @@
json_tests_report
=================
## Overview
The json_tests_report plugin creates a JSON file of test results, which is
handy for Continuous Integration build servers or as input into other
reporting tools. The JSON file is output to the appropriate
`<build_root>/artifacts/` directory (e.g. `artifacts/test/` for test tasks,
`artifacts/gcov/` for gcov, or `artifacts/bullseye/` for bullseye runs).
## Setup
Enable the plugin in your project.yml by adding `json_tests_report` to the list
of enabled plugins.
``` YAML
:plugins:
:enabled:
- json_tests_report
```
## Configuration
Optionally configure the output / artifact filename in your project.yml with
the `artifact_filename` configuration option. The default filename is
`report.json`.
You can also configure the path where this artifact is stored by setting
`path`. By default, it is placed in a subfolder under the `build` directory.
``` YAML
:json_tests_report:
  :artifact_filename: report_spectacularly.json
```
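For reference, the generated report groups results into failed, passed, and ignored tests plus a summary. A report might look roughly like this (file names, line numbers, and counts are illustrative):

``` JSON
{
  "FailedTests": [
    {
      "file": "test/TestExample.c",
      "test": "test_Example_ShouldFail",
      "line": 42,
      "message": "Expected 1 Was 0"
    }
  ],
  "PassedTests": [
    { "file": "test/TestExample.c", "test": "test_Example_ShouldPass" }
  ],
  "IgnoredTests": [],
  "Summary": {
    "total_tests": 2,
    "passed": 1,
    "ignored": 0,
    "failures": 1
  }
}
```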

View File

@ -0,0 +1,83 @@
require 'ceedling/plugin'
require 'ceedling/constants'
require 'json'
class JsonTestsReport < Plugin
def setup
@results_list = {}
@test_counter = 0
end
def post_test_fixture_execute(arg_hash)
context = arg_hash[:context]
@results_list[context] = [] if @results_list[context].nil?
@results_list[context] << arg_hash[:result_file]
end
def post_build
@results_list.each_key do |context|
results = @ceedling[:plugin_reportinator].assemble_test_results(@results_list[context])
artifact_filename = @ceedling[:configurator].project_config_hash[:json_tests_report_artifact_filename] || 'report.json'
artifact_fullpath = @ceedling[:configurator].project_config_hash[:json_tests_report_path] || File.join(PROJECT_BUILD_ARTIFACTS_ROOT, context.to_s)
file_path = File.join(artifact_fullpath, artifact_filename)
@ceedling[:file_wrapper].open(file_path, 'w') do |f|
@test_counter = 1
json = {
"FailedTests" => write_failures(results[:failures]),
"PassedTests" => write_tests(results[:successes]),
"IgnoredTests" => write_tests(results[:ignores]),
"Summary" => write_statistics(results[:counts])
}
f << JSON.pretty_generate(json)
end
end
end
private
def write_failures(results)
retval = []
results.each do |result|
result[:collection].each do |item|
@test_counter += 1
retval << {
"file" => File.join(result[:source][:path], result[:source][:file]),
"test" => item[:test],
"line" => item[:line],
"message" => item[:message]
}
end
end
return retval.uniq
end
def write_tests(results)
retval = []
results.each do |result|
result[:collection].each do |item|
@test_counter += 1
retval << {
"file" => File.join(result[:source][:path], result[:source][:file]),
"test" => item[:test]
}
end
end
return retval
end
def write_statistics(counts)
return {
"total_tests" => counts[:total],
"passed" => (counts[:total] - counts[:ignored] - counts[:failed]),
"ignored" => counts[:ignored],
"failures" => counts[:failed]
}
end
end

View File

@ -0,0 +1,119 @@
ceedling-module-generator
=========================
## Overview
The module_generator plugin adds a pair of new commands to Ceedling, allowing
you to make or remove modules according to predefined templates. With a single call,
Ceedling can generate a source, header, and test file for a new module. If given a
pattern, it can even create a series of submodules to support specific design patterns.
Finally, it can just as easily remove related modules, avoiding the need to delete
each individually.
Let's say, for example, that you want to create a single module named `MadScience`.
```
ceedling module:create[MadScience]
```
This command tells Ceedling we're speaking to the module plugin and want to create a new module. The
name of that module is between the brackets. It will keep this case, unless you have
specified a different default (see configuration). It will create three files:
`MadScience.c`, `MadScience.h`, and `TestMadScience.c`. *NOTE* that it is important that
there are no spaces between the brackets. We know, it's annoying... but it's the rules.
You can also create an entire pattern of files. To do that, just add a second argument
specifying the pattern ID. Something like this:
```
ceedling module:create[SecretLair,mch]
```
In this example, we'd create 9 files total: 3 headers, 3 source files, and 3 test files. These
files would be named `SecretLairModel`, `SecretLairConductor`, and `SecretLairHardware`. Isn't
that nice?
Similarly, you can create stubs for all functions in a header file just by making a single call
to your handy `stub` feature, like this:
```
ceedling module:stub[SecretLair]
```
This call will look in SecretLair.h and will generate a file SecretLair.c that contains a stub
for each function declared in the header! Even better, if SecretLair.c already exists, it will
add only new functions, leaving your existing calls alone so that it doesn't cause any problems.
## Configuration
Enable the plugin in your project.yml by adding `module_generator`
to the list of enabled plugins.
Then, like much of Ceedling, you can just run as-is with the defaults, or you can override those
defaults for your own needs. For example, new source and header files will be automatically
placed in the `src/` folder while tests will go in the `test/` folder. That's great if your project
follows the default ceedling structure... but what if you have a different structure?
```
:module_generator:
:project_root: ./
:source_root: source/
:inc_root: includes/
:test_root: tests/
```
Now I've redirected the location where modules are going to be generated.
### Includes
You can make it so that all of your files are generated with a standard include list. This is done
by adding to the `:includes` array. For example:
```
:module_generator:
:includes:
:tst:
- defs.h
- board.h
:src:
- board.h
```
### Boilerplates
You can specify the actual boilerplate used for each of your files. This is the handy place to
put that corporate copyright notice (or maybe a copyleft notice, if that's your preference?)
```
:module_generator:
:boilerplates: |
/***************************
* This file is Awesome. *
* That is All. *
***************************/
```
### Test Defines
You can specify the "#ifdef TEST" at the top of the test files with a custom define.
This example will put a "#ifdef CEEDLING_TEST" at the top of the test files.
```
:module_generator:
:test_define: CEEDLING_TEST
```
### Naming Convention
Finally, you can force a particular naming convention. Even if someone calls the generator
with something like `MyNewModule`, if they have the naming convention set to `:caps`, it will
generate files like `MY_NEW_MODULE.c`. This keeps everyone on your team behaving the same way.
Your options are as follows:
- `:bumpy` - BumpyFilesLooksLikeSo
- `:camel` - camelFilesAreSimilarButStartLow
- `:snake` - snake_case_is_all_lower_and_uses_underscores
- `:caps` - CAPS_FEELS_LIKE_YOU_ARE_SCREAMING
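As a sketch, a project.yml fragment forcing snake_case might look like the following (the `:naming:` key is an assumption; this README does not name the option):

``` YAML
:module_generator:
  :naming: :snake
```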

View File

@ -0,0 +1,19 @@
ceedling-raw-output-report
==========================
## Overview
The raw-output-report allows you to capture all the output from the called
tools in a single document, so you can trace back through it later. This is
useful for debugging... but can eat through memory quickly if left running.
## Setup
Enable the plugin in your project.yml by adding `raw_output_report`
to the list of enabled plugins.
``` YAML
:plugins:
:enabled:
- raw_output_report
```

View File

@ -0,0 +1,19 @@
ceedling-stdout-gtestlike-tests-report
======================================
## Overview
The stdout_gtestlike_tests_report replaces the normal ceedling "pretty" output with
a variant that resembles the output of gtest. This is most helpful when trying to
integrate into an IDE or CI that is meant to work with google test.
## Setup
Enable the plugin in your project.yml by adding `stdout_gtestlike_tests_report`
to the list of enabled plugins.
``` YAML
:plugins:
:enabled:
- stdout_gtestlike_tests_report
```

View File

@ -0,0 +1,18 @@
ceedling-stdout-ide-tests-report
================================
## Overview
The stdout_ide_tests_report replaces the normal ceedling "pretty" output with
a simplified variant intended to be easily parseable.
## Setup
Enable the plugin in your project.yml by adding `stdout_ide_tests_report`
to the list of enabled plugins.
``` YAML
:plugins:
:enabled:
- stdout_ide_tests_report
```

View File

@ -0,0 +1,20 @@
ceedling-pretty-tests-report
============================
## Overview
The stdout_pretty_tests_report is the default output of ceedling. Instead of
showing most of the raw output of CMock, Ceedling, etc., it shows a simplified
view. It also creates a nice summary at the end of execution which groups the
results into ignored and failed tests.
## Setup
Enable the plugin in your project.yml by adding `stdout_pretty_tests_report`
to the list of enabled plugins.
``` YAML
:plugins:
:enabled:
- stdout_pretty_tests_report
```

View File

@ -0,0 +1,18 @@
ceedling-teamcity-tests-report
==============================
## Overview
The teamcity_tests_report replaces the normal ceedling "pretty" output with
a version that has results tagged to be consumed with the teamcity CI server.
## Setup
Enable the plugin in your project.yml by adding `teamcity_tests_report`
to the list of enabled plugins.
``` YAML
:plugins:
:enabled:
- teamcity_tests_report
```

View File

@ -0,0 +1,19 @@
warnings-report
===============
## Overview
The warnings_report captures all warnings throughout the build process
and collects them into a single report at the end of execution. It places all
of this into a warnings file in the output artifact directory.
## Setup
Enable the plugin in your project.yml by adding `warnings_report`
to the list of enabled plugins.
``` YAML
:plugins:
:enabled:
- warnings_report
```

View File

@ -0,0 +1,11 @@
#
# build script written by: Michael Brockus.
# github repo author: Mark VanderVoord.
#
# license: MIT
#
cexception_dir = include_directories('.')
cexception_lib = static_library(meson.project_name(),
files('CException.c'),
include_directories : cexception_dir)

View File

@ -0,0 +1,85 @@
# ==========================================
# CMock Project - Automatic Mock Generation for C
# Copyright (c) 2007 Mike Karlesky, Mark VanderVoord, Greg Williams
# [Released under MIT License. Please refer to license.txt for details]
# ==========================================
class CMockGeneratorPluginIgnoreStateless
attr_reader :priority
attr_reader :config, :utils
def initialize(config, utils)
@config = config
@utils = utils
@priority = 2
end
def instance_structure(function)
if function[:return][:void?]
" char #{function[:name]}_IgnoreBool;\n"
else
" char #{function[:name]}_IgnoreBool;\n #{function[:return][:type]} #{function[:name]}_FinalReturn;\n"
end
end
def mock_function_declarations(function)
lines = if function[:return][:void?]
"#define #{function[:name]}_Ignore() #{function[:name]}_CMockIgnore()\n" \
"void #{function[:name]}_CMockIgnore(void);\n"
else
"#define #{function[:name]}_IgnoreAndReturn(cmock_retval) #{function[:name]}_CMockIgnoreAndReturn(cmock_retval)\n" \
"void #{function[:name]}_CMockIgnoreAndReturn(#{function[:return][:str]});\n"
end
# Add stop ignore function. It does not matter whether there are any args.
lines << "#define #{function[:name]}_StopIgnore() #{function[:name]}_CMockStopIgnore()\n" \
"void #{function[:name]}_CMockStopIgnore(void);\n"
lines
end
def mock_implementation_precheck(function)
lines = " if (Mock.#{function[:name]}_IgnoreBool)\n {\n"
lines << " UNITY_CLR_DETAILS();\n"
if function[:return][:void?]
lines << " return;\n }\n"
else
retval = function[:return].merge(:name => 'cmock_call_instance->ReturnVal')
lines << " if (cmock_call_instance == NULL)\n return Mock.#{function[:name]}_FinalReturn;\n"
lines << ' ' + @utils.code_assign_argument_quickly("Mock.#{function[:name]}_FinalReturn", retval) unless retval[:void?]
lines << " return cmock_call_instance->ReturnVal;\n }\n"
end
lines
end
# this function is adjusted
def mock_interfaces(function)
lines = ''
lines << if function[:return][:void?]
"void #{function[:name]}_CMockIgnore(void)\n{\n"
else
"void #{function[:name]}_CMockIgnoreAndReturn(#{function[:return][:str]})\n{\n"
end
unless function[:return][:void?]
lines << " Mock.#{function[:name]}_CallInstance = CMOCK_GUTS_NONE;\n"
lines << " Mock.#{function[:name]}_FinalReturn = cmock_to_return;\n"
end
lines << " Mock.#{function[:name]}_IgnoreBool = (char)1;\n"
lines << "}\n\n"
# Add stop ignore function. It does not matter whether there are any args.
lines << "void #{function[:name]}_CMockStopIgnore(void)\n{\n"
lines << " Mock.#{function[:name]}_IgnoreBool = (char)0;\n"
lines << "}\n\n"
lines
end
def mock_ignore(function)
" Mock.#{function[:name]}_IgnoreBool = (char)1;\n"
end
def mock_verify(function)
func_name = function[:name]
" if (Mock.#{func_name}_IgnoreBool)\n call_instance = CMOCK_GUTS_NONE;\n"
end
end