Human Generated Data

Title

Untitled (couples sitting on a ledge under a tent)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8528

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Clothing 99.9
Apparel 99.9
Person 99.7
Human 99.7
Person 99.3
Person 98.7
Person 98.7
Sitting 91.5
Shoe 85.6
Footwear 85.6
Furniture 80.9
Suit 80.6
Overcoat 80.6
Coat 80.6
Shoe 80
Shoe 79.2
Tie 75.2
Accessories 75.2
Accessory 75.2
Leisure Activities 74.2
Fashion 63.8
Robe 63.8
Dating 63
Wedding 57.4
Wedding Gown 57.3
Gown 57.3
Female 56.6
Chair 56.4
Shoe 50.2

Clarifai
created on 2023-10-25

people 99.7
group 98.5
man 96.8
group together 96.5
woman 95.1
music 93.4
adult 93
chair 88.9
monochrome 88.3
several 88.2
dancer 88
leader 87.9
dancing 87.5
many 85.1
actor 85
musician 84.1
singer 84
wear 83.1
actress 81.8
indoors 81.5

Imagga
created on 2022-01-09

person 36.4
man 31.6
people 30.7
male 30.5
stage 30.4
silhouette 28.1
wind instrument 26.6
musician 25.9
businessman 23.8
adult 23.6
musical instrument 23.5
singer 23.1
platform 21.4
business 21.3
professional 20.9
men 20.6
horn 20.2
performer 20
device 18.9
player 18.5
brass 18.1
sax 16.4
black 15.6
event 14.8
job 14.2
teacher 13.8
music 13.5
youth 12.8
crowd 12.5
employee 12.4
boss 12.4
instrumentality 12.3
cornet 12.2
group 12.1
lights 12.1
corporate 12
training 12
glowing 12
competition 11.9
executive 11.8
suit 11.8
nighttime 11.7
audience 11.7
sport 11.5
muscular 11.5
vibrant 11.4
design 11.3
office 11.2
shiny 11.1
entertainer 11
flag 11
cheering 10.8
symbol 10.8
park 10.7
stadium 10.7
handsome 10.7
match 10.6
skill 10.6
patriotic 10.5
success 10.5
boy 10.4
nation 10.4
looking 10.4
guitar 10.3
manager 10.2
indoor 10
sunset 9.9
backhand 9.9
hand 9.9
star 9.9
versus 9.9
racket 9.8
attractive 9.8
shorts 9.8
serve 9.8
tennis 9.7
court 9.7
championship 9.7
style 9.6
athlete 9.6
room 9.6
musical 9.6
icon 9.5
women 9.5
bright 9.3
communication 9.2
field 9.2
modern 9.1
one 9
educator 9
happy 8.8
together 8.8
couple 8.7
rock 8.7
lifestyle 8.7
work 8.6
performance 8.6
chair 8.5
meeting 8.5
two 8.5
accordion 8.5
instrument 8.4
sky 8.3
holding 8.3
artifact 8.2
relax 7.6
dark 7.5
human 7.5
sound 7.5
leisure 7.5
bass 7.3
alone 7.3
laptop 7.3
new 7.3
figure 7.2
night 7.1
love 7.1
interior 7.1
happiness 7.1

Google
created on 2022-01-09

Footwear 98.3
Shoe 95.1
Black-and-white 84.1
Style 84
Font 82.6
Adaptation 79.4
Flash photography 78.7
Suit 78.6
Tints and shades 76.8
Art 75.7
Event 73.1
Vintage clothing 72.8
Monochrome photography 71.1
Music 70.7
Monochrome 70.2
Sitting 68.3
Boot 67.3
Room 66.9
Tree 65.9
Photo caption 64.7

Microsoft
created on 2022-01-09

person 96.9
clothing 91.8
text 87.8
man 81.7
footwear 75.8

Face analysis

AWS Rekognition

Age 39-47
Gender Male, 65.5%
Calm 90.3%
Happy 7.6%
Surprised 0.9%
Sad 0.4%
Disgusted 0.3%
Angry 0.2%
Fear 0.2%
Confused 0.2%

AWS Rekognition

Age 37-45
Gender Male, 76.2%
Happy 58.2%
Sad 18.2%
Calm 10%
Confused 5.4%
Surprised 2.4%
Angry 2.4%
Disgusted 1.8%
Fear 1.5%

AWS Rekognition

Age 47-53
Gender Female, 97.1%
Calm 93.6%
Sad 3.8%
Surprised 0.9%
Fear 0.6%
Confused 0.5%
Disgusted 0.3%
Happy 0.1%
Angry 0.1%

AWS Rekognition

Age 45-53
Gender Male, 98.9%
Happy 44%
Calm 23.5%
Sad 15.2%
Angry 4.1%
Fear 3.9%
Surprised 3.6%
Confused 2.9%
Disgusted 2.9%

AWS Rekognition

Age 40-48
Gender Male, 92.1%
Calm 75%
Happy 6.2%
Confused 5.4%
Surprised 5.3%
Fear 2.6%
Sad 2.4%
Angry 2.1%
Disgusted 1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Shoe 85.6%
Tie 75.2%

Text analysis

Amazon

17559.
17559

Google

17559.
-VAMT
17559. -VAMT