Infrastructure for Partially Observable Markov Decision Processes (POMDP)


Documentation for package ‘pomdp’ version 1.2.2

Help Pages

absorbing_states Reachable and Absorbing States
accessors Access to Parts of the Model Description
actions Available Actions
add_policy Add a Policy to a POMDP Problem Description
Cliff_walking Cliff Walking Gridworld MDP
cliff_walking Cliff Walking Gridworld MDP
colors Default Colors for Visualization in Package pomdp
colors_continuous Default Colors for Visualization in Package pomdp
colors_discrete Default Colors for Visualization in Package pomdp
curve_multiple_directed POMDP Plot Policy Graphs
epoch_to_episode Define a POMDP Problem
estimate_belief_for_nodes Estimate the Belief for Policy Graph Nodes
greedy_MDP_action Functions for MDP Policies
greedy_MDP_policy Functions for MDP Policies
gridworld Helper Functions for Gridworld MDPs
gridworld_animate Helper Functions for Gridworld MDPs
gridworld_init Helper Functions for Gridworld MDPs
gridworld_matrix Helper Functions for Gridworld MDPs
gridworld_maze_MDP Helper Functions for Gridworld MDPs
gridworld_plot_policy Helper Functions for Gridworld MDPs
gridworld_plot_transition_graph Helper Functions for Gridworld MDPs
gridworld_rc2s Helper Functions for Gridworld MDPs
gridworld_s2rc Helper Functions for Gridworld MDPs
is_converged_POMDP Define a POMDP Problem
is_solved_MDP Define an MDP Problem
is_solved_POMDP Define a POMDP Problem
is_timedependent_POMDP Define a POMDP Problem
make_fully_observable Convert between MDPs and POMDPs
make_partially_observable Convert between MDPs and POMDPs
manual_MDP_policy Functions for MDP Policies
Maze Stuart Russell's 4x3 Maze Gridworld MDP
maze Stuart Russell's 4x3 Maze Gridworld MDP
MDP Define an MDP Problem
MDP2POMDP Convert between MDPs and POMDPs
MDP_policy_evaluation Functions for MDP Policies
MDP_policy_functions Functions for MDP Policies
normalize_MDP Access to Parts of the Model Description
normalize_POMDP Access to Parts of the Model Description
observation_matrix Access to Parts of the Model Description
observation_val Access to Parts of the Model Description
optimal_action Optimal Action for a Belief
O_ Define a POMDP Problem
plot_belief_space Plot a 2D or 3D Projection of the Belief Space
plot_policy_graph POMDP Plot Policy Graphs
plot_transition_graph Transition Graph
plot_value_function Value Function
policy Extract the Policy from a POMDP/MDP
policy_graph POMDP Policy Graphs
POMDP Define a POMDP Problem
POMDP_example_files POMDP Example Files
projection Defining a Belief Space Projection
q_values_MDP Functions for MDP Policies
random_MDP_policy Functions for MDP Policies
reachable_and_absorbing Reachable and Absorbing States
reachable_states Reachable and Absorbing States
read_POMDP Read and write a POMDP Model to a File in POMDP Format
regret Calculate the Regret of a Policy
remove_unreachable_states Reachable and Absorbing States
reward Calculate the Reward for a POMDP Solution
reward_matrix Access to Parts of the Model Description
reward_node_action Calculate the Reward for a POMDP Solution
reward_val Access to Parts of the Model Description
round_stochastic Round a Stochastic Vector or a Row-Stochastic Matrix
RussianTiger Russian Tiger Problem POMDP Specification
R_ Define a POMDP Problem
sample_belief_space Sample from the Belief Space
simulate_MDP Simulate Trajectories in an MDP
simulate_POMDP Simulate Trajectories in a POMDP
solve_MDP Solve an MDP Problem
solve_MDP_DP Solve an MDP Problem
solve_MDP_TD Solve an MDP Problem
solve_POMDP Solve a POMDP Problem using pomdp-solve
solve_POMDP_parameter Solve a POMDP Problem using pomdp-solve
solve_SARSOP Solve a POMDP Problem using SARSOP
start_vector Access to Parts of the Model Description
Three_doors Tiger Problem POMDP Specification
Tiger Tiger Problem POMDP Specification
transition_graph Transition Graph
transition_matrix Access to Parts of the Model Description
transition_val Access to Parts of the Model Description
T_ Define a POMDP Problem
update_belief Belief Update
value_function Value Function
Windy_gridworld Windy Gridworld MDP
windy_gridworld Windy Gridworld MDP
write_POMDP Read and write a POMDP Model to a File in POMDP Format
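
The functions indexed above are typically used together. A minimal workflow sketch in R, assuming the bundled Tiger POMDP and Maze MDP example data and default solver settings (n = 100 is an arbitrary choice for illustration):

library("pomdp")

## solve the Tiger POMDP shipped with the package
data("Tiger")
sol <- solve_POMDP(Tiger)
policy(sol)                            # extract the solved policy
plot_policy_graph(sol)                 # visualize the policy graph
reward(sol)                            # expected reward at the initial belief
sim <- simulate_POMDP(sol, n = 100)    # sample trajectories

## solve Stuart Russell's 4x3 maze as a (fully observable) MDP
data("Maze")
sol_maze <- solve_MDP(Maze)
policy(sol_maze)
gridworld_plot_policy(sol_maze)        # plot the actions on the gridworld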