Human Generated Data

Title

Untitled (two women and two men standing outside of circus train)

Date

1948

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5360

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Each tag below is followed by the service's confidence score, on a 0-100 (percent) scale.

Amazon
created on 2022-01-23

Person 99.5
Human 99.5
Person 99.4
Clothing 99.3
Apparel 99.3
Person 99.2
Person 99.1
Face 91.8
Female 91.8
People 91.4
Door 81
Woman 76.4
Dress 75.9
Girl 74.4
Family 73.4
Photography 67.6
Photo 67.6
Sleeve 67
Kid 66.9
Child 66.9
Portrait 66.5
Suit 63.7
Coat 63.7
Overcoat 63.7
Shorts 59.4
Pants 59.1
Man 56.3
Teen 56.3
Shirt 56.2
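
The label list above matches the output shape of Amazon Rekognition's DetectLabels operation. As a minimal sketch of how such tags could be reproduced with boto3 (the file name photo.jpg and configured AWS credentials are assumptions, not part of this record):

```python
# Minimal sketch, assuming AWS credentials are configured and that
# "photo.jpg" stands in for the museum image (not part of this record).
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # the lowest score in the list above is ~56
    )

for label in response["Labels"]:
    # Rekognition reports confidence on a 0-100 scale, as shown above.
    print(f"{label['Name']} {label['Confidence']:.1f}")
```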

Clarifai
created on 2023-10-26

people 99.8
group 97.9
man 97.1
adult 96.7
group together 96.7
wear 96
three 95.6
woman 95
two 91.3
four 90.1
several 87.5
monochrome 87.3
nostalgia 86.4
child 83.7
retro 83.3
outfit 82.7
family 81.7
sibling 80.2
nostalgic 80
aircraft 78.1
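
Tags in this shape typically come from Clarifai's general image-recognition model. A hedged sketch against the v2 REST API; the endpoint path, model name, YOUR_PAT token, and image URL are assumptions based on Clarifai's public API, not values from this record:

```python
# Hedged sketch of a Clarifai v2 "general" model call via REST.
# The endpoint path, model name, and YOUR_PAT are placeholder assumptions.
import requests

resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": "Key YOUR_PAT"},  # personal access token
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Clarifai returns 0-1 probabilities; the list above shows them as %.
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```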

Imagga
created on 2022-01-23

person 34.1
clothing 27.1
people 24.5
robe 22.4
male 20.8
garment 20.7
man 20.2
adult 19.9
portrait 18.8
planner 17.2
business 15.2
standing 14.8
smile 13.5
men 12.9
fashion 12.8
posing 12.4
smiling 12.3
lady 12.2
dress 11.7
model 11.7
face 11.4
happy 11.3
work 11.3
clothes 11.2
attractive 11.2
professional 11.1
covering 10.7
full length 10.7
interior 10.6
human 10.5
pretty 10.5
brunette 10.5
health 10.4
style 10.4
newspaper 10.3
job 9.7
businessman 9.7
body 9.6
home 9.6
worker 9.5
corporate 9.4
consumer goods 9.4
sport 9.3
pose 9.1
one 9
healthy 8.8
hair 8.7
women 8.7
lifestyle 8.7
life 8.7
elegance 8.4
suit 8.3
product 8.3
occupation 8.2
child 8.2
active 8.1
sexy 8
looking 8
art 7.9
day 7.8
wall 7.7
old 7.7
profession 7.7
black 7.4
family 7.1
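
Imagga's tagging endpoint returns a comparable tag/confidence list. A minimal sketch against its v2 REST API, assuming placeholder API credentials and an image URL that are not part of this record:

```python
# Hedged sketch of an Imagga v2 tagging call.
# API_KEY/API_SECRET and the image URL are placeholders, not record data.
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("API_KEY", "API_SECRET"),  # HTTP Basic auth per Imagga's docs
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    # Each entry pairs a 0-100 confidence with a localized tag name.
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```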

Google
created on 2022-01-23

(no tags recorded)

Microsoft
created on 2022-01-23

text 98.7
clothing 97.8
person 97.6
outdoor 97.6
holding 95.8
grass 95.7
posing 93
standing 86.5
footwear 85.2
smile 71.5
black and white 55.9
man 55
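
The Microsoft tags are consistent with Azure Computer Vision's image-analysis API. A hedged REST sketch; the resource endpoint, API version, and subscription-key placeholder are assumptions:

```python
# Hedged sketch of an Azure Computer Vision v3.2 tag request.
# The resource endpoint and SUBSCRIPTION_KEY are placeholders.
import requests

endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
resp = requests.post(
    f"{endpoint}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": "SUBSCRIPTION_KEY"},
    json={"url": "https://example.com/photo.jpg"},
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    # Confidence is a 0-1 probability; the list above shows percentages.
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```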

Color Analysis

(color swatches not captured in this text extract)

Face analysis

AWS Rekognition

Age 50-58
Gender Male, 99.2%
Calm 37.6%
Confused 22.7%
Sad 13.9%
Happy 10.3%
Angry 7.4%
Disgusted 4.7%
Fear 2.1%
Surprised 1.3%

AWS Rekognition

Age 49-57
Gender Male, 92.1%
Calm 80.3%
Sad 13.7%
Happy 2.6%
Surprised 2.1%
Fear 0.9%
Angry 0.2%
Disgusted 0.1%
Confused 0.1%

AWS Rekognition

Age 26-36
Gender Female, 98.4%
Happy 97%
Sad 1.2%
Angry 0.7%
Fear 0.4%
Calm 0.2%
Surprised 0.2%
Disgusted 0.1%
Confused 0.1%

AWS Rekognition

Age 26-36
Gender Female, 63.5%
Calm 35.3%
Surprised 33.2%
Happy 8.8%
Sad 7.7%
Disgusted 4.7%
Confused 4.7%
Angry 4.4%
Fear 1%
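
The four face records above (age range, gender, and emotion percentages) have the shape of Rekognition's DetectFaces output with all attributes requested. A minimal sketch, again using a stand-in file name:

```python
# Minimal sketch: per-face age, gender, and emotion scores via Rekognition.
# "photo.jpg" is a stand-in; Attributes=["ALL"] requests the full face report.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    # Emotions arrive unsorted; sort to match the high-to-low lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```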

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
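
The four Google Vision blocks report likelihood buckets (Very unlikely through Very likely) rather than percentages, one block per detected face. A sketch with the google-cloud-vision Python client, assuming a placeholder file name and configured credentials:

```python
# Hedged sketch: Google Cloud Vision face detection, printing the same
# likelihood buckets shown above. "photo.jpg" is a placeholder file name.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum, e.g. Likelihood.VERY_UNLIKELY.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```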

Feature analysis

Amazon

Person 99.5%

Categories

(none recorded)

Text analysis

Amazon

24032A

Google

24032 A
24032
A
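
Both services read the number stenciled on the train car: Rekognition reports the line "24032A", while Google Vision reports the full line plus its word-level parts. A minimal sketch of the Rekognition side, with the same stand-in file name:

```python
# Minimal sketch: OCR-style text detection with Rekognition DetectText.
# "photo.jpg" is a stand-in for the actual museum image.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    # Type is "LINE" or "WORD"; Google Vision's split into "24032" and "A"
    # corresponds to the WORD-level entries here.
    print(detection["Type"], detection["DetectedText"])
```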