Human Generated Data

Title

Untitled (two men in overalls sitting in dark work room)

Date

1955

People

Artist: Claseman Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11098

Machine Generated Data

Tags

Amazon
created on 2019-03-25

Furniture 100
Person 99.3
Human 99.3
Person 99.2
Chair 98.3
Chair 96.9
Shoe 88.8
Apparel 88.8
Footwear 88.8
Clothing 88.8
Shoe 88.4
Shoe 85.9
Table 84.3
Clinic 81.9
Room 68.9
Indoors 68.9
Helmet 66.8
People 63.9
Dining Table 60
Photo 58.6
Photography 58.6
Workshop 57.4
Shorts 56.8
Art 55.6
Drawing 55.6

Clarifai
created on 2019-03-25

people 100
group together 99.1
group 98.8
adult 98.4
furniture 97.3
man 95.6
two 95.4
several 95
woman 94.9
three 94.3
vehicle 93.7
many 92.5
seat 89.9
watercraft 89.8
four 89.8
room 87.6
wear 87.5
child 86.1
aircraft 82.8
recreation 82.3

Imagga
created on 2019-03-25

brass 87
trombone 82.6
wind instrument 70.9
musical instrument 47.6
man 20.8
sax 19.2
male 19.1
chair 18.4
people 17.8
business 15.8
group 15.3
work 14.9
black 13.8
person 13.5
technology 12.6
room 12.4
stage 11.7
businessman 11.5
table 11.2
engineer 10.3
sitting 10.3
men 10.3
adult 10
job 9.7
interior 9.7
concert 9.7
music 9.6
chart 9.5
building 9.1
modern 9.1
cornet 9
body 8.8
equipment 8.7
boy 8.7
construction 8.5
studio 8.3
silhouette 8.3
human 8.2
blackboard 8.1
classroom 8.1
teacher 8.1
working 7.9
women 7.9
design 7.9
teaching 7.8
play 7.7
musical 7.7
sky 7.6
engineering 7.6
pencil 7.6
professional 7.4
company 7.4
light 7.3
computer 7.2
worker 7.2
drawing 7.1
device 7

Google
created on 2019-03-25

Microsoft
created on 2019-03-25

person 94.4
sport 89.1
black and white 60.9
boy 44.7
child 37.9
monochrome 20.6

Face analysis

Amazon

AWS Rekognition

Age 27-44
Gender Male, 98.9%
Surprised 1.4%
Angry 2.3%
Sad 5.6%
Calm 77.5%
Happy 1%
Confused 2.4%
Disgusted 9.8%

AWS Rekognition

Age 26-43
Gender Male, 98.2%
Happy 28.7%
Sad 5.3%
Angry 3.2%
Disgusted 1.5%
Surprised 3.8%
Calm 51.6%
Confused 5.9%

Feature analysis

Amazon

Person 99.3%
Chair 98.3%
Shoe 88.8%
Helmet 66.8%

Captions

Microsoft

a group of people sitting in a chair 80.4%
a group of people sitting in chairs 80.3%
a group of people sitting on a chair 77.7%

Text analysis

Amazon

:
: WO MOS
WO
MOS