NodePattern: debugging tools #109

Closed
14 changes: 13 additions & 1 deletion docs/modules/ROOT/pages/node_pattern_compiler.adoc
@@ -70,7 +70,7 @@ The `Lexer` emits tokens with types that are:
* symbols of the form `:tTOKEN_TYPE` for the rest (e.g.
`:tPREDICATE`)

Tokens are stored as `[type, value]`.
Tokens are stored as `[type, value]`, or `[type, [value, location]]` if locations are emitted.

[discrete]
==== Generation
@@ -238,3 +238,15 @@ see `Node#in_sequence_head`)
==== Precedence

Like the node pattern subcompiler, it generates code that has higher or equal precedence to `&&`, so as to make chaining convenient.
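
For example, illustrative fragments like the following (not actual compiler output) can be chained with `&&` without extra parentheses:

[source,ruby]
----
node.is_a?(::RuboCop::AST::Node) && node.send_type? && node.children.size == 2
----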

== Variant: WithMeta

These variants of the Parser / Builder / Lexer generate `location` information for AST nodes, as well as comments with their locations (exactly like the `parser` gem).

Since this information is not typically used when one only wants to define methods, it is not loaded by default.
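
A minimal usage sketch (the pattern is an arbitrary example; `parse`, `tokens` and `comments` are defined in `with_meta.rb` below):

[source,ruby]
----
parser = RuboCop::AST::NodePattern::Parser::WithMeta.new
ast = parser.parse('(send nil? :puts ...)')
ast.loc.expression # => source range covering the whole pattern
parser.tokens      # => tokens as [type, [value, range]]
parser.comments    # => comments with their locations
----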

== Variant: Debug

These variants of the Compiler / Subcompilers work by adding tracing code before and after each compilation performed by `NodePatternSubcompiler` and `SequenceSubcompiler`.
A unique ID is assigned to each node, and the tracing code flips a corresponding switch just before the expression is evaluated and again afterwards (joined with `&&`, so the second switch is only flipped if the node was a match).
Atoms are not compiled differently, as they are not really matchable (when not compiled as a node pattern).
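
For instance, where the standard compiler would emit something like `node.sym_type?` for a `sym` node, the Debug variant wraps it roughly as follows (the node ID is illustrative):

[source,ruby]
----
trace.enter(3) && node.sym_type? && trace.success(3)
----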
3 changes: 3 additions & 0 deletions lib/rubocop/ast.rb
@@ -79,3 +79,6 @@
require_relative 'ast/token'
require_relative 'ast/traversal'
require_relative 'ast/version'

::RuboCop::AST::NodePattern::Parser.autoload :WithMeta, "#{__dir__}/ast/node_pattern/with_meta"
::RuboCop::AST::NodePattern::Compiler.autoload :Debug, "#{__dir__}/ast/node_pattern/compiler/debug"
168 changes: 168 additions & 0 deletions lib/rubocop/ast/node_pattern/compiler/debug.rb
@@ -0,0 +1,168 @@
# frozen_string_literal: true

require 'rainbow'

module RuboCop
module AST
class NodePattern
class Compiler
# Variant of the Compiler with tracing information for nodes
class Debug < Compiler
# Compiled node pattern requires a named parameter `trace`,
# which should be an instance of this class
class Trace
def initialize
@visit = {}
end

def enter(node_id)
@visit[node_id] = false
true
end

def success(node_id)
@visit[node_id] = true
end

# return nil (not visited), false (not matched) or true (matched)
def matched?(node_id)
@visit[node_id]
end
end

attr_reader :node_ids

# @api private
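# Illustrative usage sketch (pattern and Ruby code are arbitrary examples):
#   colorizer = Colorizer.new('(send nil? :puts ...)')
#   result = colorizer.test('puts :hello')
#   result.returned      # => return value of the match
#   puts result.colorize # => the pattern source, colored by match status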
class Colorizer
COLOR_SCHEME = {
not_visitable: :lightseagreen,
nil => :yellow,
false => :red,
true => :green
}.freeze

# Result of a NodePattern run against a particular AST
# Consider the constructor private
Result = Struct.new(:colorizer, :trace, :returned, :ruby_ast) do # rubocop:disable Metrics/BlockLength
# @return [String] a Rainbow colorized version of ruby
def colorize(color_scheme = COLOR_SCHEME)
map = color_map(color_scheme)
ast.loc.expression.source_buffer.source.chars.map.with_index do |char, i|
Rainbow(char).color(map[i])
end.join
end

# @return [Hash] a map for {character_position => color}
def color_map(color_scheme = COLOR_SCHEME)
@color_map ||=
match_map
.transform_values { |matched| color_scheme.fetch(matched) }
.map { |node, color| color_map_for(node, color) }
.inject(:merge)
.tap { |h| h.default = color_scheme.fetch(:not_visitable) }
end

# @return [Hash] a map for {node => matched?}, depth-first
def match_map
@match_map ||=
ast
.each_node
.to_h { |node| [node, matched?(node)] }
end

# @return a value of `Trace#matched?` or `:not_visitable`
def matched?(node)
id = colorizer.compiler.node_ids.fetch(node) { return :not_visitable }
trace.matched?(id)
end

private

def color_map_for(node, color)
return {} unless (range = node.loc&.expression)

range.to_a.to_h { |char| [char, color] }
end

def ast
colorizer.node_pattern.ast
end
end

Compiler = Debug

attr_reader :pattern, :compiler, :node_pattern

def initialize(pattern, compiler: self.class::Compiler.new)
@pattern = pattern
@compiler = compiler
@node_pattern = ::RuboCop::AST::NodePattern.new(pattern, compiler: @compiler)
end

# @return [Result] the result of running the pattern against `ruby`
def test(ruby, trace: self.class::Compiler::Trace.new)
ruby = ruby_ast(ruby) if ruby.is_a?(String)
returned = @node_pattern.as_lambda.call(ruby, trace: trace)
self.class::Result.new(self, trace, returned, ruby)
end

private

def ruby_ast(ruby)
buffer = ::Parser::Source::Buffer.new('(ruby)', source: ruby)
ruby_parser.parse(buffer)
end

def ruby_parser
require 'parser/current'
builder = ::RuboCop::AST::Builder.new
::Parser::CurrentRuby.new(builder)
end
end

def initialize
super
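# Assigns a unique, sequential id to each pattern node, keyed by object identity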
@node_ids = Hash.new { |h, k| h[k] = h.size }.compare_by_identity
end

def named_parameters
super << :trace
end

def parser
@parser ||= Parser::WithMeta.new
end

def_delegators :parser, :comments, :tokens

# @api private
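# Wraps the code produced by the parent subcompiler between tracing calls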
module InstrumentationSubcompiler
def do_compile
"#{tracer(:enter)} && #{super} && #{tracer(:success)}"
end

private

def tracer(kind)
"trace.#{kind}(#{node_id})"
end

def node_id
compiler.node_ids[node]
end
end

# @api private
class NodePatternSubcompiler < Compiler::NodePatternSubcompiler
include InstrumentationSubcompiler
end

# @api private
class SequenceSubcompiler < Compiler::SequenceSubcompiler
include InstrumentationSubcompiler
end
end
end
end
end
end
111 changes: 111 additions & 0 deletions lib/rubocop/ast/node_pattern/with_meta.rb
@@ -0,0 +1,111 @@
# frozen_string_literal: true

module RuboCop
module AST
class NodePattern
class Parser
# Overrides Parser to use `WithMeta` variants and provide additional methods
class WithMeta < Parser
# Overrides Lexer to emit token locations and comments
class Lexer < NodePattern::Lexer
attr_reader :source_buffer

def initialize(str_or_buffer)
@source_buffer = if str_or_buffer.respond_to?(:source)
str_or_buffer
else
::Parser::Source::Buffer.new('(string)', source: str_or_buffer)
end
@comments = []
super(@source_buffer.source)
end

def token(type, value)
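# Emit `[type, [value, range]]` rather than `[type, value]`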
super(type, [value, pos])
end

def emit_comment
@comments << Comment.new(pos)
super
end

# @return [::Parser::Source::Range] last match's position
def pos
::Parser::Source::Range.new(source_buffer, ss.pos - ss.matched_size, ss.pos)
end
end

# Overrides Builder to emit nodes with locations
class Builder < NodePattern::Builder
def emit_atom(type, token)
value, loc = token
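# Keep the first/last characters as begin/end delimiters (e.g. `:` of a symbol),
# unless they are word characters and thus part of the value itself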
begin_l = loc.resize(1)
end_l = loc.end.adjust(begin_pos: -1)
begin_l = nil if begin_l.source.match?(/\w/)
end_l = nil if end_l.source.match?(/\w/)
n(type, [value], source_map(token, begin_t: begin_l, end_t: end_l))
end

def emit_unary_op(type, operator_t = nil, *children)
children[-1] = children[-1].first if children[-1].is_a?(Array) # token?
map = source_map(children.first.loc.expression, operator_t: operator_t)
n(type, children, map)
end

def emit_list(type, begin_t, children, end_t)
expr = children.first.loc.expression.join(children.last.loc.expression)
map = source_map(expr, begin_t: begin_t, end_t: end_t)
n(type, children, map)
end

def emit_call(type, selector_t, args = nil)
selector, = selector_t
begin_t, arg_nodes, end_t = args

map = source_map(selector_t, begin_t: begin_t, end_t: end_t, selector_t: selector_t)
n(type, [selector, *arg_nodes], map)
end

private

def n(type, children, source_map)
super(type, children, { location: source_map })
end

def loc(token_or_range)
return token_or_range[1] if token_or_range.is_a?(Array)

token_or_range
end

def join_exprs(left_expr, right_expr)
left_expr.loc.expression
.join(right_expr.loc.expression)
end

def source_map(token_or_range, begin_t: nil, end_t: nil, operator_t: nil, selector_t: nil)
expression_l = loc(token_or_range)
expression_l = expression_l.expression if expression_l.respond_to?(:expression)
locs = [begin_t, end_t, operator_t, selector_t].map { |token| loc(token) }
begin_l, end_l, operator_l, selector_l = locs

expression_l = locs.compact.inject(expression_l, :join)

::Parser::Source::Map::Send.new(_dot_l = nil, selector_l, begin_l, end_l, expression_l)
.with_operator(operator_l)
end
end

attr_reader :comments, :tokens

def do_parse
r = super
@comments = @lexer.comments
@tokens = @lexer.tokens
r
end
end
end
end
end
end
25 changes: 24 additions & 1 deletion spec/rubocop/ast/node_pattern/helper.rb
@@ -1,5 +1,28 @@
# frozen_string_literal: true

require_relative 'parse_helper'

Failure = Struct.new(:expected, :actual)

module NodePatternHelper
include ParseHelper

def assert_equal(expected, actual, mess = nil)
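# Minitest-style assertion expected by `ParseHelper`, expressed as an RSpec expectation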
expect(actual).to eq(expected), *mess
end

def assert(test, mess = nil)
expect(test).to eq(true), *mess
end

def expect_parsing(ast, source, source_maps)
version = '-'
try_parsing(ast, source, parser, source_maps, version)
end
end

RSpec.shared_context 'parser' do
let(:parser) { RuboCop::AST::NodePattern::Parser.new }
include NodePatternHelper

let(:parser) { RuboCop::AST::NodePattern::Parser::WithMeta.new }
end
7 changes: 4 additions & 3 deletions spec/rubocop/ast/node_pattern/lexer_spec.rb
@@ -2,7 +2,7 @@

RSpec.describe RuboCop::AST::NodePattern::Lexer do
let(:source) { '(send nil? #func(:foo) #func (bar))' }
let(:lexer) { RuboCop::AST::NodePattern::Parser::Lexer.new(source) }
let(:lexer) { RuboCop::AST::NodePattern::Parser::WithMeta::Lexer.new(source) }
let(:tokens) do
tokens = []
while (token = lexer.next_token)
@@ -12,9 +12,10 @@
end

it 'provides tokens via next_token' do # rubocop:disable RSpec/ExampleLength
type, (text, _range) = tokens[3]
type, (text, range) = tokens[3]
expect(type).to eq :tFUNCTION_CALL
expect(text).to eq :func
expect(range.to_range).to eq 11...16

expect(tokens.map(&:first)).to eq [
'(',
@@ -31,7 +32,7 @@
let(:source) { '(array sym $int+ x)' }

it 'works' do
expect(tokens.map(&:last)).to eq \
expect(tokens.map(&:last).map(&:first)).to eq \
%i[( array sym $ int + x )]
end
end