Human Generated Data

Title

Untitled (outdoor wedding reception)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8315

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Human 99.1
Person 99.1
Person 97.3
Person 96.9
Person 95.4
Person 94.9
Person 91.8
Crowd 90.8
Person 89.8
Audience 89.5
Person 85.3
Person 82.5
Clothing 81.9
Apparel 81.9
People 78.8
Person 78
Person 73.3
Person 71.6
Female 71.4
Person 66.9
Person 66.1
Housing 65.7
Building 65.7
Person 63.7
Drawing 61.8
Art 61.8
Girl 59.1
School 57.8
Person 57
Furniture 56
Chair 56
Woman 55.9

Clarifai
created on 2023-10-26

people 99.9
child 99.3
group 98.8
many 98.4
education 98.2
elementary school 97.9
school 97.3
adult 96.9
group together 96.7
man 96.2
woman 95.5
war 92.4
boy 92.4
administration 91.5
teacher 88.3
monochrome 85.4
crowd 83.4
street 80.5
classroom 79.5
home 78.5

Imagga
created on 2022-01-08

kin 36.5
people 29.5
building 24.7
city 24.1
man 22.2
school 19.5
men 18.9
group 16.1
business 15.8
travel 15.5
person 15
urban 14.8
world 14.8
street 13.8
male 13.5
portrait 12.3
women 11.9
team 11.6
adult 11.6
walking 11.4
happy 11.3
structure 10.8
life 10.7
together 10.5
scene 10.4
musician 10
tourist 10
singer 10
tourism 9.9
old 9.7
outdoors 9.7
couple 9.6
sitting 9.4
day 9.4
smiling 9.4
youth 9.4
child 9.3
classroom 9.2
boy 8.7
crowd 8.6
architecture 8.6
walk 8.6
smile 8.5
two 8.5
black 8.4
house 8.4
teamwork 8.3
room 8.2
transportation 8.1
family 8
sport 8
businessman 7.9
office 7.9
station 7.8
gymnasium 7.8
corporate 7.7
mother 7.7
train 7.7
silhouette 7.4
window 7.4
park 7.4
uniform 7.4
passenger 7.4
children 7.3
performer 7.3
square 7.2
road 7.2
home 7.2

Google
created on 2022-01-08

Building 95.3
Window 91.4
Black 89.5
House 86.4
Black-and-white 84.8
Chair 84.6
Style 83.8
Musical instrument 76.8
Crowd 76.1
Monochrome photography 75.7
Monochrome 75.5
Snapshot 74.3
Event 73.9
Vintage clothing 71.7
Room 70.4
Photo caption 69.9
Tree 69.5
Art 68.8
Pole 68.7
History 66.1

Microsoft
created on 2022-01-08

text 98.3
outdoor 90.9
person 87.5
people 81.1
house 68.8
black and white 67.3
cemetery 63.9
grave 61.1
funeral 57.9
clothing 54.3

Face analysis

AWS Rekognition

Age 33-41
Gender Male, 99.6%
Calm 89.1%
Happy 8.9%
Sad 0.8%
Angry 0.4%
Fear 0.3%
Disgusted 0.3%
Surprised 0.2%
Confused 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Feature analysis

Amazon

Person 99.1%
Chair 56%

Text analysis

Amazon

10085
S
M 117
M 117 YE3
YE3

Google

10085. 10085. 100 85.
10085.
100
85.