Human Generated Data

Title

Untitled (two men talking on tractor in field)

Date

1951-1957

People

Artist: Orrion Barger, American, active 1913 - 1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6422

Machine Generated Data

Tags (each label is followed by its confidence score, 0-100)

Amazon
created on 2019-03-22

Apparel 99.7
Clothing 99.7
Human 98.4
Person 98.4
Person 97.6
Hat 96.8
Sun Hat 74.3
Hat 66.2
Face 59.6
Coat 56.1
Overcoat 56.1
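
The labels above are the output of Amazon Rekognition's label detection. A minimal sketch of an equivalent call with boto3, assuming AWS credentials are configured and that "photo.jpg" is a local copy of the image (both placeholders):

import boto3

client = boto3.client("rekognition")  # assumes AWS credentials/region are configured

with open("photo.jpg", "rb") as f:  # placeholder: local copy of the photograph
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,       # assumed cap on returned labels
        MinConfidence=50,   # assumed confidence floor
    )

# Print label name and confidence in the same "Label 99.7" form as above
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")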

Clarifai
created on 2019-03-22

people 99.6
man 97.3
adult 96.7
one 93.2
two 92.8
vehicle 92.5
lid 91.6
wear 88
transportation system 87.9
woman 87.3
actor 84.1
three 82.6
veil 81.5
recreation 80.8
group together 80.4
sitting 79.4
monochrome 76
outfit 75.5
watercraft 69
group 68.1
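
Clarifai concepts like these come from its general prediction model. A minimal sketch against the Clarifai v2 REST API, where the API key, model ID, and image URL are placeholders; Clarifai reports concept values on a 0-1 scale, scaled here to match the 0-100 scores above:

import requests

API_KEY = "YOUR_API_KEY"                # placeholder: a valid Clarifai API key
MODEL_ID = "general-image-recognition"  # assumption: the public general model

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)

# Each concept carries a name and a 0-1 value; scale to percent
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")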

Imagga
created on 2019-03-22

person 21.7
man 17.5
technology 16.3
black 16.2
device 15.3
metal 15.3
television 14.8
male 14.2
aviator 14.2
modern 12.6
working 12.4
people 12.3
light 11.3
equipment 11.1
lifestyle 10.8
steel 10.6
computer 10.5
human 10.5
helmet 9.9
one 9.7
style 9.6
laptop 9.3
sport 9.2
work 8.9
science 8.9
broadcasting 8.8
body 8.8
art 8.7
play 8.6
portrait 8.4
adult 8.4
worker 7.8
render 7.8
glass 7.8
men 7.7
studio 7.6
hand 7.6
glasses 7.4
competition 7.3
protection 7.3
information 7.1
newspaper 7.1
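
Imagga's scores come from its /v2/tags endpoint, which authenticates with HTTP Basic credentials. A minimal sketch, with the key, secret, and image URL as placeholders:

import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},  # placeholder image URL
    auth=("api_key", "api_secret"),  # placeholders: Imagga Basic-auth credentials
)

# Each entry pairs a confidence (0-100) with an English tag name
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")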

Google
created on 2019-03-22

(no tags listed)

Microsoft
created on 2019-03-22

person 98.9
outdoor 85.4
player 73.3
black and white 73.3
monochrome 33.3
bicycle 26.7

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 29-45
Gender Female, 64.4%
Angry 15.5%
Surprised 9.2%
Disgusted 15.3%
Sad 19%
Calm 14.1%
Happy 11.8%
Confused 15.1%

AWS Rekognition

Age 26-43
Gender Female, 92.2%
Disgusted 1.5%
Sad 0.5%
Confused 0.4%
Angry 0.6%
Calm 94.6%
Happy 1.4%
Surprised 0.9%
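
The age ranges, gender estimates, and per-emotion confidences above correspond to Rekognition's face detection with all facial attributes requested. A minimal sketch with boto3, again assuming configured AWS credentials and a placeholder file name:

import boto3

client = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("photo.jpg", "rb") as f:  # placeholder: local copy of the photograph
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

# One FaceDetails entry per detected face, matching the two blocks above
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        # Emotion types arrive uppercase (e.g. "CALM"); match the listing's casing
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")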

Feature analysis

Amazon

Person 98.4%
Hat 96.8%

Captions

Microsoft
created on 2019-03-22

a person sitting at a table with a racket 41.3%
a person sitting on a bench 41.2%
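
These captions match the shape of the Azure Computer Vision "describe" operation, which returns ranked caption candidates (along with tags) and their confidences. A minimal sketch over the REST endpoint; the regional endpoint, subscription key, and image URL are placeholders, and the v2.0 API version is an assumption consistent with the 2019 creation date:

import requests

ENDPOINT = "https://YOUR_REGION.api.cognitive.microsoft.com"  # placeholder endpoint
KEY = "YOUR_SUBSCRIPTION_KEY"                                 # placeholder key

resp = requests.post(
    f"{ENDPOINT}/vision/v2.0/describe",
    headers={"Ocp-Apim-Subscription-Key": KEY},
    params={"maxCandidates": 3},  # assumed number of caption candidates
    json={"url": "https://example.com/photo.jpg"},
)

# Confidence arrives on a 0-1 scale; scale to percent to match the listing
for caption in resp.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")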