Human Generated Data

Title

Untitled (Mask and Wig members drinking water from a pitcher)

Date

1938

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8224

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 97.1
Human 97.1
Person 96
Person 95.7
Person 95.4
Sunglasses 94.2
Accessories 94.2
Accessory 94.2
Clothing 93.1
Apparel 93.1
Person 92.8
Sunglasses 91.8
Female 88.1
Face 83.3
People 71.5
Blonde 70.4
Teen 70.4
Kid 70.4
Child 70.4
Girl 70.4
Woman 70.4
Sunglasses 66.7
Portrait 66.4
Photography 66.4
Photo 66.4
Poster 57.2
Advertisement 57.2

Clarifai
created on 2023-10-25

people 99.9
group 99.4
adult 98.3
group together 98.2
recreation 97.2
child 97.1
woman 97.1
many 96.4
man 96.1
music 95.2
vehicle 94.1
adolescent 92.9
watercraft 91.2
several 89
wear 88.5
education 87
transportation system 85.4
musician 83.5
leader 83.2
administration 83.1

Imagga
created on 2022-01-08

television 90.8
broadcasting 46.8
telecommunication 35.1
telecommunication system 26.1
medium 22.7
monitor 21.4
computer 20.8
laptop 20.6
billboard 20.5
people 18.4
man 16.1
signboard 15.8
male 15.6
technology 15.6
screen 14.6
business 14.6
equipment 13.5
working 12.4
adult 11.6
car 11.4
black 11.4
office 11.2
structure 10.8
person 10.6
happy 10
modern 9.8
old 9
outdoors 8.9
together 8.8
smiling 8.7
love 8.7
lifestyle 8.7
attractive 8.4
digital 8.1
group 8.1
art 8
women 7.9
electronic equipment 7.8
work 7.8
smile 7.8
travel 7.7
notebook 7.7
sport 7.6
studio 7.6
keyboard 7.5
one 7.5
vintage 7.4
retro 7.4
display 7.1
sculpture 7

Microsoft
created on 2022-01-08

text 99.7
person 94.4
bottle 82.8
drawing 77.3
black and white 73.1
soft drink 56.6

Face analysis

Amazon

AWS Rekognition

Age 36-44
Gender Male, 79.9%
Calm 97.7%
Sad 1.8%
Surprised 0.2%
Disgusted 0.1%
Angry 0.1%
Confused 0.1%
Happy 0.1%
Fear 0%

AWS Rekognition

Age 36-44
Gender Female, 58.4%
Calm 73.7%
Surprised 9.1%
Sad 8.6%
Disgusted 2.6%
Angry 2.2%
Confused 1.8%
Fear 1.4%
Happy 0.7%

AWS Rekognition

Age 14-22
Gender Male, 72.9%
Happy 47.2%
Calm 40%
Surprised 4.7%
Angry 2.1%
Sad 2.1%
Confused 1.6%
Fear 1.4%
Disgusted 1%

AWS Rekognition

Age 39-47
Gender Female, 93.6%
Calm 99.6%
Happy 0.3%
Surprised 0.1%
Angry 0%
Disgusted 0%
Sad 0%
Confused 0%
Fear 0%

AWS Rekognition

Age 24-34
Gender Male, 90.8%
Calm 93.6%
Sad 6%
Angry 0.1%
Confused 0.1%
Happy 0.1%
Disgusted 0.1%
Surprised 0%
Fear 0%

Feature analysis

Amazon

Person 97.1%
Sunglasses 94.2%
Poster 57.2%

Text analysis

Amazon

7620

Google

口O An00 7620 2620 7620
O
An00
7620
2620