Human Generated Data

Title

Untitled (two women and a man at wedding reception)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8576

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Clothing 99.9
Apparel 99.9
Person 99
Human 99
Person 99
Person 98.9
Person 98.3
Person 95.2
Hat 93.7
Accessories 90.9
Tie 90.9
Accessory 90.9
Sunglasses 86.1
Home Decor 79.5
Face 78.5
Coat 72.9
Suit 72.9
Overcoat 72.9
Person 71
Sun Hat 65.3
People 64.8
Dress 63.1
Photography 62.5
Photo 62.5
Portrait 62.2
Texture 56.4
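
A minimal sketch of how a label list like the one above can be produced with AWS Rekognition's DetectLabels API via boto3. The filename, region, and MinConfidence threshold are illustrative assumptions, not details of the pipeline that generated this record.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Placeholder filename for a local copy of the photograph
with open("steinmetz_wedding_reception.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # assumed cutoff; the list above bottoms out around 56
)

for label in response["Labels"]:
    # Each label carries a name and a confidence score, e.g. "Clothing 99.9"
    print(f"{label['Name']} {label['Confidence']:.1f}")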

Clarifai
created on 2023-10-25

people 99.8
man 97.8
group 97.1
adult 96.2
group together 94
wear 93.1
monochrome 93
woman 92.4
wedding 90.9
veil 90.9
three 89.9
street 89.8
two 85.7
four 84
elderly 81.7
music 80.5
child 80.5
recreation 79.5
gown (clothing) 79
dress 78.9

Imagga
created on 2022-01-09

man 48.4
male 41.8
person 39
people 36.8
adult 29.2
businessman 25.6
business 24.3
senior 23.4
professional 21
meeting 20.7
teacher 20.4
office 20.3
happy 20
men 18.9
mature 18.6
couple 18.3
executive 18.1
sitting 17.2
indoors 16.7
smiling 16.6
corporate 16.3
portrait 16.2
old 15.3
home 15.2
businesspeople 14.2
working 14.1
team 13.4
lifestyle 13
table 13
group 12.9
occupation 12.8
colleagues 12.6
entrepreneur 12.6
patient 12.6
handsome 12.5
elderly 12.4
together 12.3
businesswoman 11.8
retired 11.6
job 11.5
medical 11.5
smile 11.4
cheerful 11.4
education 11.3
looking 11.2
happiness 11
indoor 11
baron 10.8
educator 10.6
room 10.5
tie 10.4
desk 10.4
doctor 10.3
casual 10.2
20s 10.1
face 9.9
white 9.9
jacket 9.9
suit 9.8
to 9.7
student 9.6
women 9.5
career 9.5
teamwork 9.3
laptop 9.1
life 9
health 9
black 9
interior 8.8
discussing 8.8
work 8.8
40s 8.8
discussion 8.8
standing 8.7
hospital 8.6
horizontal 8.4
hand 8.4
camera 8.3
worker 8.3
holding 8.3
care 8.2
aged 8.1
employee 8.1
family 8
love 7.9
associates 7.9
coworkers 7.9
hands 7.8
husband 7.7
corporation 7.7
busy 7.7
planning 7.7
grandfather 7.7
30s 7.7
retirement 7.7
four 7.7
profession 7.7
two 7.6
adults 7.6
speaker 7.5
coffee 7.4
classroom 7.4
confident 7.3
color 7.2
computer 7.2
clothing 7.2
restaurant 7

Google
created on 2022-01-09

Human 89.9
Black 89.5
Hat 87.2
Sleeve 87.1
Black-and-white 86
Gesture 85.3
Style 84
Adaptation 79.3
Monochrome photography 75.6
Snapshot 74.3
Monochrome 74
Font 73.3
Event 73
Vintage clothing 70.8
Sun hat 69.4
Art 69.2
Pattern 67.3
Classic 67
Suit 66.3
Photo caption 66.2
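
A comparable sketch for the Google list above, using the google-cloud-vision client's label detection. The filename is a placeholder, and authentication is assumed to be configured via application default credentials.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Placeholder filename for a local copy of the photograph
with open("steinmetz_wedding_reception.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)
for annotation in response.label_annotations:
    # Scores come back as 0-1 floats; the list above shows them as percentages
    print(f"{annotation.description} {annotation.score * 100:.1f}")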

Microsoft
created on 2022-01-09

person 99.3
clothing 96.2
text 96.1
man 91.6
black and white 72.3
human face 53.6

Face analysis

AWS Rekognition

Age 23-33
Gender Female, 50.2%
Calm 78.1%
Sad 19.6%
Confused 0.6%
Angry 0.5%
Disgusted 0.4%
Happy 0.4%
Surprised 0.3%
Fear 0.2%

AWS Rekognition

Age 34-42
Gender Female, 76.9%
Happy 92.6%
Sad 4.4%
Calm 0.8%
Surprised 0.7%
Angry 0.4%
Disgusted 0.4%
Confused 0.3%
Fear 0.3%

AWS Rekognition

Age 52-60
Gender Male, 98.6%
Calm 91%
Angry 3.2%
Confused 2.4%
Sad 0.9%
Fear 0.8%
Happy 0.8%
Surprised 0.7%
Disgusted 0.2%
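
The three blocks above are per-face estimates; a minimal sketch of retrieving them with AWS Rekognition's DetectFaces API via boto3 follows. The filename and region are placeholders.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_wedding_reception.jpg", "rb") as f:  # placeholder filename
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # required to get AgeRange, Gender, and Emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions are returned unsorted; sort to mirror the listing above
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")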

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
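
And the Google Vision counterpart, whose face annotations report likelihood buckets rather than percentages; again the filename is a placeholder.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_wedding_reception.jpg", "rb") as f:  # placeholder filename
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum: VERY_UNLIKELY, UNLIKELY, POSSIBLE, ...
    for field in ("surprise", "anger", "sorrow", "joy", "headwear", "blurred"):
        value = getattr(face, f"{field}_likelihood")
        print(field.capitalize(), vision.Likelihood(value).name)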

Feature analysis

Amazon

Person 99%
Hat 93.7%
Tie 90.9%
Sunglasses 86.1%

Text analysis

Amazon

12
17758.
85LLI

Google

17758. 58 ררו 12
17758.
ררו
58
12
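
A sketch of how OCR strings like these are extracted: AWS Rekognition's DetectText API returns line- and word-level detections (the Google column would come from the Cloud Vision text_detection method). The filename and region are placeholders.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_wedding_reception.jpg", "rb") as f:  # placeholder filename
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    # LINE entries give whole lines; WORD entries repeat the same text word by word
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])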