Human Generated Data

Title

Untitled (man and woman standing next to harp player at wedding)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8335

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags
(each label is followed by the tagging service's confidence score, on a 0-100 scale)

Amazon
created on 2022-01-09

Clothing 99.8
Apparel 99.8
Person 98.4
Human 98.4
Overcoat 97.3
Coat 97.3
Person 96.9
Tie 95.6
Accessories 95.6
Accessory 95.6
Sunglasses 95.6
Person 84.5
Person 84.1
Suit 82.7
Tuxedo 82.2
Person 82
Text 67
Shirt 66.9
Person 65.2
Hat 62.3
Smoke 62.2
Military 59.1
Military Uniform 59.1
Home Decor 57
Sun Hat 56.3
Person 50.7
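
Tags like these can be reproduced with the Rekognition DetectLabels API. Below is a minimal boto3 sketch, assuming a local copy of the image; the file name and the confidence floor are illustrative placeholders, not values taken from this record.

import boto3

# Minimal sketch: label detection with AWS Rekognition.
# "steinmetz_8335.jpg" is a hypothetical local copy of this photograph.
client = boto3.client("rekognition")
with open("steinmetz_8335.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=50,  # assumed floor; the tags above bottom out around 50
    )

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')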

Clarifai
created on 2023-10-25

people 99.7
group 98.3
man 98
adult 97.2
wear 96.2
monochrome 93
two 92.6
coat 90.8
portrait 90.2
eyewear 89.5
group together 88.9
outerwear 88.6
veil 87.5
music 87.1
three 85.7
outfit 84.3
woman 81.7
medicine 81.1
four 79.6
five 79
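
Concept lists like this come from Clarifai's prediction API. Below is a hedged sketch against the v2 REST endpoint and Clarifai's public general model; the API key, model ID, and image URL are assumptions, not details recorded here.

import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"  # placeholder credential
MODEL_ID = "aaa03c23b3724a16a56b629203edc62c"  # assumed: Clarifai's public general model

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]},
)
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Clarifai returns values on a 0-1 scale; this record shows 0-100.
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')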

Imagga
created on 2022-01-09

man 32.2
person 25.8
people 22.8
male 22.7
work 18.8
adult 16.2
black 16.2
old 16
men 15.4
job 14.1
building 13.8
business 13.3
world 12.8
architecture 12.5
mask 12.2
engineer 12.2
group 12.1
construction 12
professional 11.9
worker 11.8
city 11.6
businessman 11.5
serious 11.4
standing 10.4
portrait 10.3
clothing 9.4
industry 9.4
occupation 9.2
tourism 9.1
equipment 8.6
builder 8.5
human 8.2
industrial 8.2
working 7.9
coat 7.9
suit 7.9
film 7.8
manager 7.4
safety 7.4
brass 7.3
protection 7.3
danger 7.3
dirty 7.2
negative 7.2
looking 7.2
covering 7.2
women 7.1
face 7.1
helmet 7
white 7
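
Imagga exposes this tagger through its /v2/tags REST endpoint, authenticated with a key/secret pair over HTTP Basic auth. A minimal sketch follows; the credentials and image URL are placeholders.

import requests

# Placeholders: Imagga issues a key/secret pair per account.
auth = ("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET")

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},
    auth=auth,
)
for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')  # confidences already 0-100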

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 96.9
person 95.9
clothing 95.4
man 91.7
black and white 87.7
human face 79.9
drawing 71.9
glasses 56.7
sunglasses 55.1
monochrome 52.4
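
Microsoft's tags match the output of the Azure Computer Vision Image Analysis API. A minimal sketch against the v3.2 REST endpoint follows; the resource endpoint, subscription key, and image URL are placeholders.

import requests

endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
key = "YOUR_AZURE_KEY"  # placeholder

resp = requests.post(
    f"{endpoint}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": key},
    json={"url": "https://example.org/photo.jpg"},
)
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')  # API reports 0-1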

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 38-46
Gender Male, 99.7%
Calm 51.1%
Sad 27.3%
Disgusted 10.9%
Confused 3.3%
Surprised 3%
Angry 2.6%
Fear 1%
Happy 0.7%
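
The age range, gender, and ranked emotions above follow the shape of Rekognition's DetectFaces response. A minimal boto3 sketch follows; the local file name is an illustrative assumption.

import boto3

client = boto3.client("rekognition")
with open("steinmetz_8335.jpg", "rb") as f:  # hypothetical local copy
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions arrive unsorted; rank them to match the record above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')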

Feature analysis

Amazon

Person 98.4%
Tie 95.6%
Sunglasses 95.6%
Suit 82.7%

Categories

Text analysis

Amazon

110.79
ISO
LLC79
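
Amazon's strings are the output of the Rekognition DetectText API, which returns both LINE and WORD detections. A minimal boto3 sketch follows; the local file name is an illustrative assumption.

import boto3

client = boto3.client("rekognition")
with open("steinmetz_8335.jpg", "rb") as f:  # hypothetical local copy
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip the finer-grained WORD entries
        print(detection["DetectedText"])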

Google

L079
OSI
HAGOX-
L079 OSI HAGOX- YT33A2-NAMTZAR
YT33A2-NAMTZAR
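
Google's strings are consistent with Cloud Vision's text detection feature. A minimal sketch with the google-cloud-vision client library follows; the credentials setup and file name are assumptions.

from google.cloud import vision

# Assumes GOOGLE_APPLICATION_CREDENTIALS points at a service-account key.
client = vision.ImageAnnotatorClient()

with open("steinmetz_8335.jpg", "rb") as f:  # hypothetical local copy
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)
for annotation in response.text_annotations:
    print(annotation.description)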