Human Generated Data

Title

Untitled (two men talking on tractor in field)

Date

1951-1957

People

Artist: Orrion Barger, American, active 1913-1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6423

Machine Generated Data

Tags (model confidence, 0-100)

Amazon
created on 2019-03-22

Apparel 99.9
Clothing 99.9
Person 98.7
Human 98.7
Person 98.4
Hat 96.8
Hat 87.2
Sun Hat 81.2
Coat 75.1
Bike 71.1
Transportation 71.1
Vehicle 71.1
Bicycle 71.1
Overcoat 68.1
Cowboy Hat 62.9
Motorcycle 62.8
Face 59.4
Tire 55.7
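
The list above matches the shape of Amazon Rekognition's DetectLabels output. A minimal sketch of how such tags could be produced with boto3; the file name, region, and threshold are illustrative placeholders, not part of this record:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Hypothetical local copy of the photograph; not distributed with this record.
    with open("barger_untitled.jpg", "rb") as f:
        image_bytes = f.read()

    # DetectLabels returns label names with 0-100 confidence scores,
    # comparable to the Amazon tag list above.
    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,
        MinConfidence=55,  # the lowest score above is Tire at 55.7
    )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")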

Clarifai
created on 2019-03-22

people 98.9
man 96.8
one 96.3
adult 95.9
actor 92.3
lid 92
vehicle 89
veil 87
wear 86.5
two 86
sitting 83.7
woman 82.2
music 82.1
transportation system 81.7
war 78.9
portrait 78.1
uniform 77.6
outfit 77.4
military 77.4
instrument 76.6
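
Clarifai concepts like these are typically retrieved through Clarifai's gRPC Python client (clarifai-grpc). A sketch under that assumption; the API key, model ID, and image URL are placeholders and vary by account and client version:

    from clarifai_grpc.channel.clarifai_channel import ClarifaiChannel
    from clarifai_grpc.grpc.api import resources_pb2, service_pb2, service_pb2_grpc
    from clarifai_grpc.grpc.api.status import status_code_pb2

    stub = service_pb2_grpc.V2Stub(ClarifaiChannel.get_grpc_channel())
    metadata = (("authorization", "Key YOUR_CLARIFAI_API_KEY"),)  # placeholder key

    request = service_pb2.PostModelOutputsRequest(
        model_id="general-image-recognition",  # assumed general concept model
        inputs=[resources_pb2.Input(data=resources_pb2.Data(
            image=resources_pb2.Image(url="https://example.org/barger_untitled.jpg")))],
    )
    response = stub.PostModelOutputs(request, metadata=metadata)

    if response.status.code == status_code_pb2.SUCCESS:
        # Concept values are on a 0-1 scale; the list above shows percentages.
        for concept in response.outputs[0].data.concepts:
            print(f"{concept.name} {concept.value * 100:.1f}")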

Imagga
created on 2019-03-22

dishwasher 36.6
white goods 29.3
computer 25.7
laptop 24.8
home appliance 23.9
working 23
people 21.7
person 21.3
work 21.2
technology 20.8
business 18.8
appliance 17
adult 16.8
man 16.8
worker 16
office 15.4
professional 14.3
smiling 13
car 13
smile 12.8
device 12.7
happy 12.5
job 12.4
male 12
attractive 11.9
looking 11.2
pretty 10.5
one 10.4
home 10.4
portrait 10.3
sitting 10.3
black 10.3
communication 10.1
businesswoman 10
corporate 9.4
occupation 9.2
human 9
desk 8.7
brunette 8.7
travel 8.4
modern 8.4
shirt 8.4
help 8.4
hand 8.3
durables 8.3
monitor 8.2
support 8.1
hair 7.9
cute 7.9
look 7.9
standing 7.8
notebook 7.8
men 7.7
luxury 7.7
automobile 7.7
engineer 7.6
vehicle 7.6
holding 7.4
equipment 7.4
product 7.3
20s 7.3
alone 7.3
newspaper 7.2
success 7.2
lifestyle 7.2
indoors 7
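
Imagga exposes its tagger as a REST endpoint with HTTP Basic authentication. A sketch assuming the v2 /tags endpoint; the key, secret, and image URL are placeholders:

    import requests

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/barger_untitled.jpg"},  # placeholder
        auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),  # placeholder credentials
        timeout=30,
    )
    resp.raise_for_status()

    # Each entry pairs an English tag with a 0-100 confidence, as listed above.
    for item in resp.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}")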

Google
created on 2019-03-22

(no tags recorded)

Microsoft
created on 2019-03-22

racket 96.1
outdoor 94.7
person 94.1
player 82
black and white 82
art 43.1
monochrome 35.2
music 24.2
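
These tags match the shape of the Azure Computer Vision Analyze response of that era (v2.0). A sketch with a placeholder endpoint region, subscription key, and image URL:

    import requests

    endpoint = "https://westus.api.cognitive.microsoft.com/vision/v2.0/analyze"
    resp = requests.post(
        endpoint,
        params={"visualFeatures": "Tags,Description"},
        headers={"Ocp-Apim-Subscription-Key": "AZURE_CV_KEY"},  # placeholder key
        json={"url": "https://example.org/barger_untitled.jpg"},  # placeholder URL
        timeout=30,
    )
    resp.raise_for_status()

    # Confidences come back on a 0-1 scale; the list above shows percentages.
    for tag in resp.json()["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")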

Color Analysis

(no color analysis recorded)

Face analysis

Amazon

AWS Rekognition

Age 35-53
Gender Female, 69.9%
Surprised 9.3%
Happy 12.1%
Angry 9.5%
Confused 5.1%
Disgusted 16.2%
Sad 7.8%
Calm 40.0%

AWS Rekognition

Age 35-52
Gender Female, 73.1%
Confused 9.0%
Happy 5.7%
Angry 16.8%
Disgusted 4.3%
Calm 27.6%
Sad 20.8%
Surprised 15.7%
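
Per-face blocks like the two above correspond to Amazon Rekognition's DetectFaces call with all facial attributes requested. A sketch with the same placeholder file name as earlier:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("barger_untitled.jpg", "rb") as f:  # placeholder file name
        image_bytes = f.read()

    # Attributes=["ALL"] adds age range, gender, and emotion estimates per face.
    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")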

Feature analysis

Amazon

Person 98.7%
Hat 96.8%
Bicycle 71.1%
Motorcycle 62.8%
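
The feature-analysis entries are the subset of labels for which Rekognition also returns bounding-box instances, which is what allows them to be located on the image. A sketch, again with a placeholder file name:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")
    with open("barger_untitled.jpg", "rb") as f:  # placeholder file name
        labels = rekognition.detect_labels(
            Image={"Bytes": f.read()}, MinConfidence=55
        )["Labels"]

    # Labels with Instances (Person, Hat, Bicycle, Motorcycle above) carry
    # bounding boxes; plain scene labels have an empty Instances list.
    for label in labels:
        for instance in label["Instances"]:
            box = instance["BoundingBox"]
            print(f"{label['Name']} {instance['Confidence']:.1f}% "
                  f"(left={box['Left']:.2f}, top={box['Top']:.2f}, "
                  f"w={box['Width']:.2f}, h={box['Height']:.2f})")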

Captions

(no captions recorded)

Text analysis

Amazon

KODIK-
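
The detected string is consistent with Amazon Rekognition's DetectText call picking up printed text such as a film edge marking. A sketch with a placeholder file name:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("barger_untitled.jpg", "rb") as f:  # placeholder file name
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    # LINE detections aggregate words; "KODIK-" above is one such line.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"], f"{detection['Confidence']:.1f}%")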