[ldv-project] Launching Klever
Ilja Zakharov
ilja.zakharov at ispras.ru
Wed Nov 7 12:07:18 MSK 2018
Dear Ahmet,
Wed, 7 Nov 2018 at 8:04, A. Celenk <ahmet.celenk at procenne.com>:
> Hello Ilja,
>
> Thanks a lot for the great instructions! I'll try v2.0 when I find a
> convenient time. By the way, reading through the steps, some questions popped
> up in my mind:
>
> - I assume we are downloading kernel 3.14.79 to verify it (not to use
> it), so why do we build it? Isn't Klever performing source code level
> analysis?
>
There are two reasons for this. The first one is technical: Klever requires
the compilation commands to weave generated environment models and assertions
for the checked requirements into the source code of the program under
verification (this is done aside from the provided source code). The second
one is that Klever decomposes the program into fragments and generates
environment models before verification, and this step needs a lot of
information about the program (global variables, a call graph, and so on).
Clade collects all the information Klever needs here.
Klever performs source code level analysis, but it requires quite good
environment models since it analyses not just separate independent functions
but possible execution paths from entry points (the main function, device
driver initialization functions). For instance, for device drivers an
environment model helps the software verification tool used as a backend in
Klever to take into account that at initialization the driver registers
callbacks and that these callbacks are called later in an appropriate order,
as described in the Klever environment model specifications.
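To make the pattern concrete, here is a minimal character-device skeleton in
plain Linux kernel C (ordinary kernel API, used only for illustration, not a
Klever specification): at initialization the driver registers callbacks, and
the kernel, i.e. the environment, may invoke them later and only in a
sensible order (my_release() only after a successful my_open()).

#include <linux/module.h>
#include <linux/fs.h>

static int my_open(struct inode *inode, struct file *filp) { return 0; }
static int my_release(struct inode *inode, struct file *filp) { return 0; }

static const struct file_operations my_fops = {
    .owner   = THIS_MODULE,
    .open    = my_open,     /* may be called by the environment after registration */
    .release = my_release,  /* called only after a successful open */
};

static int major;

static int __init my_init(void)
{
    /* registration point: from here on the environment may call the callbacks */
    major = register_chrdev(0, "my_dev", &my_fops);
    return major < 0 ? major : 0;
}

static void __exit my_exit(void)
{
    unregister_chrdev(major, "my_dev");
}

module_init(my_init);
module_exit(my_exit);
MODULE_LICENSE("GPL");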
> - Are we going to be able to test our own drivers with this release?
>
Yes, this is possible. The simplest way is to place your drivers in the kernel
sources and compile them with Clade just as is done for any other Linux module
or subsystem. If you prefer to compile them in a separate directory aside from
the kernel source directory, that is also possible, but you need to add the
extra path to "working source trees.json" (see the steps I described).
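For example, if the kernel was built in /home/user/linux-3.14.79 and your
drivers are compiled in /home/user/my-drivers (both paths are purely
illustrative), "working source trees.json" should contain a JSON array with
both absolute paths:
["/home/user/linux-3.14.79", "/home/user/my-drivers"]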
But note that, depending on the Linux kernel you are using, you will need to
adjust job.json in the web interface to choose the most suitable set of
specifications. Also, the specifications provided with the release do not
support the various types of Linux device drivers equally well.
>
>
> - Are we going to be able to test just a simple e.g. "main.c" file, or do
> we need a kernel?
>
This is also possible, but Klever is intended for sound software verification,
i.e. automatically proving program correctness relative to explicitly stated,
formalized requirements (it is not yet another bug finding tool). The software
verification tools used inside Klever need to know about undefined functions
(even some functions from the standard library) and about how the program can
interact with its environment (the OS, other programs) in order to complete
their proofs. In practice, this means that to obtain good results and get
guarantees that your program is correct, one needs to provide models of some
functions, describe the program decomposition into fragments (if the program
is larger than 100 thousand LoC) and do other steps that can sometimes be
laborious.
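To give a rough idea of what a model of an undefined function can look like,
here is a sketch in plain C. The __VERIFIER_nondet_*/__VERIFIER_assume
convention is understood by backends such as CPAchecker; the exact form Klever
expects is defined by its own specifications, and read_sensor() is a made-up
function, so treat this only as an illustration:

/* Model of an undefined function: the real implementation is replaced by a
 * nondeterministic value restricted by the assumptions the proof relies on. */
extern int __VERIFIER_nondet_int(void);
extern void __VERIFIER_assume(int cond);

int read_sensor(void)
{
    int value = __VERIFIER_nondet_int();            /* anything the environment may return */
    __VERIFIER_assume(value >= 0 && value <= 4095); /* assumed 12-bit sensor range */
    return value;
}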
> Kind Regards,
>
> Ahmet
> On 6.11.2018 18:50, Ilja Zakharov wrote:
>
> Dear Ahmet,
>
> As Evgeny Novikov promised (he is giving a talk at a conference), below is
> a short guide to deploying and running the Klever 2.0 software verification
> framework.
>
> 1. Choose your favorite Ubuntu/Debian image. The instructions below are
> prepared for Ubuntu 18, but that is not strictly required.
> 2. Download and create a new desktop installation or virtual machine with
> the system. Don't forget about our hardware requirements
> <https://klever.readthedocs.io/en/latest/deploy.html#hardware-requirements>
> !
> 3. Download our software verification framework:
> sudo apt-get install git
> git clone https://github.com/ldv-klever/klever.git
>
> 4. Then follow the deployment steps
> <https://klever.readthedocs.io/en/latest/deploy_local.html> for the local
> deployment. In particular, modify klever-minimal.json.sample and save it as
> klever.json in the $KLEVER_SRC/deployment/conf/ directory. Also set
> environment variables specifying the $KLEVER_SRC directory with the Klever
> source files and the target installation directory $KLEVER_DEPLOY_DIR. Then
> install Klever with the command:
> sudo $KLEVER_SRC/deploys/bin/deploy-local --deployment-directory
> $KLEVER_DEPLOY_DIR install production
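>
> For concreteness, the whole of step 4 might look roughly like this (the
> directories below are only placeholders for this sketch, adjust them to your
> setup):
> export KLEVER_SRC=$HOME/klever
> export KLEVER_DEPLOY_DIR=$HOME/klever-inst
> cp $KLEVER_SRC/deployment/conf/klever-minimal.json.sample $KLEVER_SRC/deployment/conf/klever.json
> # edit klever.json here if needed, then:
> sudo $KLEVER_SRC/deploys/bin/deploy-local --deployment-directory $KLEVER_DEPLOY_DIR install production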
>
> 5. After the Klever installation you can build the kernel and create the
> build base.
>
> Download the kernel:
> wget https://cdn.kernel.org/pub/linux/kernel/v3.x/linux-3.14.79.tar.gz
> tar -xf linux-3.14.79.tar.gz
>
> Install a suitable compiler:
> sudo update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-4.8 10
> sudo update-alternatives --config gcc
> and choose the 4.8 compiler there.
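> (If gcc 4.8 is not installed yet, on Ubuntu 18 it should be available from
> the standard repositories, e.g.:
> sudo apt-get install gcc-4.8
> and then rerun the commands above.)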
>
> Prepare your PATH environment variable:
> export PATH=$KLEVER_DEPLOY_DIR/klever-addons/CIF/bin/:$PATH
>
> Then run the following commands. They will take a lot of time:
> make allmodconfig
> clade-intercept make -j8 modules_prepare
> clade-intercept -a make -j8 all
> clade-all -p linux_kernel ./cmds.txt
> echo "[\"$(readlink -f .)\"]" > "./clade/Storage/working source trees.json"
>
> 6. Move this build base to another place. This is an optional step, but if
> you are going to build more bases in this directory, then it is better to do
> it. Note that you need a new base for each new configuration or kernel
> version.
> mv clade $NEW_PLACE/linux-3.14.79
>
> 7. Modify your deployment klever.json file in $KLEVER_DEPLOY_DIR and add a
> new section at the end of the file, in the root JSON object:
> "Klever Build Bases": ["$NEW_PLACE/linux-3.14.79"]
>
> Instead of $NEW_PLACE, set an absolute path to the directory with your
> build base.
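> As a rough sketch, after this change the end of klever.json would look
> something like the following (the path is only an example, and the other
> settings from klever-minimal.json.sample stay as they are):
>   ...,
>   "Klever Build Bases": ["/home/user/build-bases/linux-3.14.79"]
> }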
>
> 8. Update the Klever installation:
> sudo $KLEVER_SRC/deploys/bin/deploy-local --deployment-directory
> $KLEVER_DEPLOY_DIR update production
>
> 9. Now you can run verification.
> Open localhost:8998 in the web browser.
> Input username: manager, password: manager.
> Click "Linux 3.14".
> Click Job->Edit or ->Copy and edit job.json and other files depending on
> your needs.
> In job.json set "build base": "linux-3.14.79", or another name if you named
> your build base differently.
> Before running any large target, such as all modules, first try to verify
> the modules given with the example.
> 10. Click Decision->Start.
> 11. Click "Start" or "Start with default values".
> 12. Wait for the verification results. This can take a very long time,
> especially if your hardware isn't powerful enough.
>
> If you run into any issues, please report them to us. We will try to solve
> or work around them together. Note also that we renamed some files, so there
> are some differences from the Klever 1.1 version in the job files.
>
> Best,
> Ilia Zakharov
>
>
> Wed, 17 Oct 2018 at 10:40, Evgeny Novikov <novikov at ispras.ru>:
>
>> Dear Ahmet,
>>
>> Your primary question will not be resolved in Klever 2.0, so I will answer
>> it now.
>>
>> In the deployment documentation you can find the hardware requirements
>> <https://klever.readthedocs.io/en/latest/deploy.html#hardware-requirements>.
>> I suspect that your VM does not meet them, so you should either beef it up
>> (I recommend this to get better verification results faster) or use more
>> advanced settings. At the page http://localhost:8998/service/schedulers/ you
>> can find the maximum amount of computing resources that Klever can spend on
>> solving verification jobs and verification tasks. In the file you attached
>> there are restrictions on resources just for verification jobs. But
>> verification tasks are solved as part of solving verification jobs, so you
>> should also take into account the maximum resources allowed for verification
>> tasks. Usually they are much larger than the ones for verification jobs!
>> These resources are specified within the file "tasks.json" at the job page,
>> say, http://ldvdev:8998/jobs/5/. I hope that this is enough for you to
>> proceed.
>>
>> For answering your second question it is definitely necessary to wait for
>> Klever 2.0, as it will be intended for verification of C software in
>> general, unlike Klever 1.x, which can verify just Linux device drivers. But
>> please do not expect that Klever 2.0 will be able to verify some C program
>> without additional effort. Our rich experience and preliminary experiments
>> show that one will need to develop a suitable configuration and
>> specifications for each type of program, while verification tools like
>> CPAchecker sometimes do not handle unfamiliar code well. If you are ready to
>> invest much time, you are welcome! We will be glad to hear some notes and
>> remarks from you.
>>
>> --
>> Evgeny Novikov
>> Linux Verification Center, ISP RAS
>> http://linuxtesting.org
>>
>>
>>
>> 17.10.2018, 09:12, "A. Celenk" <ahmet.celenk at procenne.com>:
>>
>> Hello Evgeny,
>>
>> Thank you very much for Klever, and for your help. I will be eagerly
>> waiting for the 2.0 release. Until then, I may try to use the current
>> version to learn some basics. Here is what I have done:
>>
>> I could reach the web interface and managed to log in as manager, and
>> played around a little bit. Considering the warning in the deployment
>> tutorial about messing up the computer, I have installed Klever on an Ubuntu
>> 18.04 VM. Being a VM, it has low resources. I started the preset Linux
>> 3.14 job with the default configuration and got the "Given resource limits
>> for job and tasks in sum are too high, we do not have such amount of
>> resources" error. Then I checked out the manual configuration, used the
>> "paranoid development" preset configuration, and lowered the resources as
>> much as I could (see my attached configuration). Nevertheless, I am still
>> getting the same error and it fails.
>>
>> Here are my questions for the current release; if the answers would change
>> greatly for v2.0, just ignore these and I will wait for the new one:
>>
>> - Is it fine to use Klever in such a low-resourced VM? If so, what should
>> my configuration be? If not, what are the minimum system requirements
>> for Klever?
>> - How do I upload a new job (e.g. just a single C file, or a project
>> with a makefile)?
>>
>> Thank you again,
>>
>> Regards,
>>
>> Ahmet
>>
>> On 16-10-2018 18:00, Evgeny Novikov wrote:
>>
>> Also, I would like to inform you that in a couple of weeks we will
>> release Klever 2.0, which will change a great many things related to the
>> core functionality. If you have time, then I recommend waiting for this
>> release and trying it rather than a version that will be deprecated soon.
>>
>> 16:08, 16 October 2018, Evgeny Novikov <novikov at ispras.ru>:
>>
>>
>> 16.10.2018, 13:55, "Evgeny Novikov" <novikov at ispras.ru>:
>>
>> Hello Ahmet,
>>
>> Your question is absolutely fair. Klever lacks user documentation, and I
>> think that it will not have it in the near future. So, if you would like to
>> proceed, most likely you will have many questions.
>>
>> You don't need to run any script to start Klever if the deployment was
>> successful. You just need to open localhost:8998 in your web browser.
>>
>>
>> I forgot to mention that there are 3 users created in advance:
>>
>> - The first one is Administrator (login: admin, password: admin), who can
>> change user privileges. Most likely you will not need it unless you are
>> going to create a multiuser setup.
>> - The second one is Manager (login: manager, password: manager). This
>> user can do almost everything and you should start with it!
>> - The third one is a Service user that serves internal purposes.
>>
>>
>>
>>
>> If you have any other questions, please ask us. We will be glad to get
>> some feedback from you.
>>
>> --
>> Evgeny Novikov
>> Linux Verification Center, ISP RAS
>> http://linuxtesting.org
>>
>>
>>
>> 16.10.2018, 11:14, "A. Celenk" <ahmet.celenk at procenne.com>:
>>
>> Hello,
>>
>> I will ask a dumb question: how do we run Klever? I've completed the
>> deployment, but I couldn't figure out which Python script (or other thing)
>> I should run to start testing.
>>
>> P.S.: I've used the example klever-minimal.json.sample as the configuration file.
>>
>> Thanks a lot,
>>
>> Ahmet Celenk
>>
>>
>> _______________________________________________
>> ldv-project mailing list
>> ldv-project at linuxtesting.org
>> http://linuxtesting.org/cgi-bin/mailman/listinfo/ldv-project
>>
--
Ilia Zakharov
Linux Verification Center, ISP RAS
http://linuxtesting.org