Wednesday, September 18, 2013

ViewObject: getFilteredRows

There is a method getFilteredRows(RowQualifier rowQualifier) in the API of the ViewObjectImpl class.
You can create a RowQualifier simply by passing a condition string to its constructor:
new RowQualifier("attribute = '" + val + "'");

The allowed conditions are limited to simple comparison operators like =, <>, > and <. It does not accept other SQL constructs such as NVL, IS NULL, IS NOT NULL, etc. This limitation is not specified in any documentation.

Example:
HR schema, EMPLOYEES table.
In the EmployeesViewImpl class, add the following:
    public Row[] getEmployeesWithJob(String job) {
        RowQualifier qualifier = new RowQualifier("JobId ='" + job + "'");
        return getFilteredRows(qualifier);
    }

Then call the above method from your client code:
appModule.getEmployeesView1().getEmployeesWithJob("AD_VP");

This will return a Row array containing the rows that match the condition.

But after some experimenting, I found that this code may introduce an SQL injection threat. For example:
appModule.getEmployeesView1().getEmployeesWithJob("AD_VP' OR ''='");

This will return all rows in the view object.

So if the function's input is supplied by the user, this will be a serious security risk.
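To see why the crafted input works, it helps to look at the string the example method actually builds: the embedded quote closes the JobId literal, and the rest appends an always-true condition. A minimal plain-Java sketch (QualifierDemo and buildQualifier are hypothetical names; the concatenation is the same as in getEmployeesWithJob above):

```java
// Demonstrates how the crafted input corrupts the qualifier string
// built by getEmployeesWithJob in the example above.
public class QualifierDemo {

    // Same concatenation as in getEmployeesWithJob.
    static String buildQualifier(String job) {
        return "JobId ='" + job + "'";
    }

    public static void main(String[] args) {
        // Normal input: a simple equality condition.
        System.out.println(buildQualifier("AD_VP"));
        // → JobId ='AD_VP'

        // Crafted input: closes the literal and adds an always-true clause.
        System.out.println(buildQualifier("AD_VP' OR ''='"));
        // → JobId ='AD_VP' OR ''=''
    }
}
```

The second qualifier matches every row, which is exactly what the experiment below shows.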

Tested on JDeveloper 11.1.1.5.

Edited on 19/9/2013:
Concerning the underlined paragraph above: I jumped to conclusions when I tried it, and I am sorry about that.
Here is my explanation after I did some more digging. Using the tracing tools of the Oracle database, I think it is safe to conclude the following:
1) The filtering is done in memory; the resulting query is not modified to include the condition.
2) As a result of 1, I don't think it's correct to call this an SQL injection threat. However, this does not mean the issue of injection is not there.
3) As a result of 1, all rows of the table will be fetched into memory. This may result in performance issues, especially for tables with a huge number of rows.
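Points 1 and 3 can be illustrated with a plain-Java analogy (this is not ADF code; InMemoryFilterDemo and its methods are hypothetical stand-ins): the full row set is materialized first, and the qualifier is only evaluated afterwards in the middle tier, so the condition never reaches the SQL sent to the database.

```java
import java.util.ArrayList;
import java.util.List;

// Plain-Java analogy of the in-memory filtering behavior described above.
public class InMemoryFilterDemo {

    // Stand-in for the full fetch that getFilteredRows triggers:
    // imagine each string is the JobId of one EMPLOYEES row.
    static List<String> fetchAllJobIds() {
        List<String> all = new ArrayList<>();
        all.add("AD_VP");
        all.add("IT_PROG");
        all.add("AD_VP");
        return all;
    }

    // The filtering step: runs row by row in the middle tier,
    // after every row has already crossed the network.
    static List<String> filterInMemory(List<String> jobIds, String job) {
        List<String> matches = new ArrayList<>();
        for (String id : jobIds) {
            if (id.equals(job)) {
                matches.add(id);
            }
        }
        return matches;
    }

    public static void main(String[] args) {
        List<String> all = fetchAllJobIds();      // all rows are fetched first
        System.out.println(filterInMemory(all, "AD_VP").size()); // prints 2
    }
}
```

In real ADF code, a way to avoid both problems is to push the condition into the query itself with a bind variable, e.g. via setWhereClause and setNamedWhereClauseParam on the view object, so the database does the filtering.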

2 comments:

  1. There is a performance issue with using RowQualifier; it appears when you have a lot of rows.

    Replies
    1. As I said in the part that was added later on, the filtering is done in memory. This means all rows will be fetched into memory and then filtered, which can create a huge load on the database and on the network between the middle tier and the database.
