Human Generated Data

Title

Untitled (four circus performers playing cards for money)

Date

c. 1941

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5214

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.4
Human 99.4
Person 99.2
Person 99.2
Person 97.2
Nature 96.7
Outdoors 94.9
Face 93.5
Apparel 84.1
Clothing 84.1
Rural 82.3
Shelter 82.3
Countryside 82.3
Building 82.3
Female 82.2
Smile 78
People 76.3
Icing 74.4
Cake 74.4
Creme 74.4
Dessert 74.4
Cream 74.4
Food 74.4
Girl 71.6
Kid 67.8
Child 67.8
Woman 65.7
Dress 65.4
Photography 64.4
Photo 64.4
Portrait 64.4
Snow 59.7
Sphere 57.6
Baby 57.3

Imagga
created on 2022-01-23

television 51.4
negative 27.7
people 25.1
telecommunication system 24.9
film 23.6
photographic paper 17.5
broadcasting 17
person 16.1
happy 15.6
technology 14.8
adult 14.3
male 14.2
couple 13.9
man 13.4
smile 12.8
business 12.7
portrait 12.3
telecommunication 12
love 11.8
photographic equipment 11.7
monitor 11.5
symbol 11.4
art 11.4
smiling 10.8
modern 10.5
men 10.3
women 10.3
happiness 10.2
lifestyle 10.1
dress 9.9
silhouette 9.9
party 9.4
equipment 9.3
world 9.2
computer 8.8
sitting 8.6
design 8.4
container 8.4
laptop 8.4
black 8.4
tray 8.4
fun 8.2
lady 8.1
screen 8.1
celebration 8
medium 8
copy 7.9
together 7.9
work 7.8
education 7.8
fashion 7.5
sign 7.5
human 7.5
group 7.2
sport 7.1
face 7.1
working 7.1

Microsoft
created on 2022-01-23

text 99.6
window 95.8
posing 94.3
drawing 87.6
cartoon 81.6
black and white 64.5
old 53.8

Face analysis

AWS Rekognition

Age 35-43
Gender Female, 98%
Happy 60.6%
Calm 29%
Sad 5.3%
Surprised 2.3%
Confused 1.1%
Disgusted 0.7%
Angry 0.7%
Fear 0.4%

AWS Rekognition

Age 41-49
Gender Female, 85.4%
Happy 71.5%
Surprised 11.4%
Calm 9.1%
Sad 3.1%
Fear 1.5%
Disgusted 1.4%
Angry 1.1%
Confused 0.9%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%

Captions

Microsoft

a group of people posing for a photo 93.9%
a group of people posing for a photo in front of a window 87.9%
a group of people posing for the camera 87.8%

Text analysis

Amazon

A70A
16224
16224.
h2291
MJI3 ЭТАЯТIИ A70A h2291
MJI3
ЭТАЯТIИ

Google

16224.
168240 16224. MJIR 3TARTIM AR0Ahzz
168240
AR0Ahzz
3TARTIM
MJIR