Human Generated Data

Title

Untitled (two men and a woman seated below a mantle with flowers)

Date

1940

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8469

Copyright

© Estate of Joseph Janney Steinmetz


Machine Generated Data

Tags

Amazon
created on 2022-01-15

Clothing 100
Apparel 100
Person 99.3
Human 99.3
Dress 98.8
Suit 98.6
Overcoat 98.6
Coat 98.6
Person 98
Person 97.7
Chair 95.1
Furniture 95.1
Bridegroom 95.1
Wedding 95.1
Robe 94.8
Fashion 94.8
Gown 93.5
Female 92.6
Tie 92.6
Accessories 92.6
Accessory 92.6
Face 92.6
Bride 85
Wedding Gown 85
Plant 84.3
Tuxedo 81.5
Woman 81.2
Indoors 80.7
Shirt 74.9
Portrait 74.5
Photography 74.5
Photo 74.5
Flower 73.9
Blossom 73.9
Man 72.7
Table 70.6
Flower Arrangement 63.7
Dining Table 63.3
Room 59.8
Costume 59.7
Building 59.3
Girl 58.1

Clarifai
created on 2023-10-26

people 99.9
group 98.8
adult 97.5
group together 97.3
man 97.2
many 95.7
leader 94
wear 93.6
outfit 93.5
administration 92.4
wedding 91.2
woman 88.5
several 88.1
musician 85
music 84.2
ceremony 83.3
menswear 83.2
dinner jacket 80.7
monochrome 78.4
veil 78.2

Imagga
created on 2022-01-15

man 30.9
male 27.7
people 27.3
person 24
picket fence 20.6
adult 19.1
fence 16.4
men 14.6
professional 13.8
sport 13.4
businessman 13.2
boy 13
business 12.7
human 12.7
barrier 12.5
work 11.9
silhouette 11.6
standing 11.3
play 11.2
women 11.1
active 11
black 10.8
outdoor 10.7
musical instrument 10.6
nurse 10.6
player 10.5
group 10.5
ball 10.1
lifestyle 10.1
brass 9.9
worker 9.9
team 9.8
fun 9.7
portrait 9.7
equipment 9.6
sky 9.6
wind instrument 9.4
holding 9.1
health 9
game 8.9
child 8.8
happy 8.8
room 8.7
athlete 8.5
beach 8.4
summer 8.4
obstruction 8.3
leisure 8.3
outdoors 8.2
exercise 8.2
activity 8.1
success 8
job 8
life 7.8
couple 7.8
world 7.7
old 7.7
two 7.6
dark 7.5
teacher 7.5
teamwork 7.4
competition 7.3
dress 7.2
looking 7.2
idea 7.1
family 7.1
love 7.1
travel 7
mask 7

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 98.2
person 93.8
black and white 79
posing 38.7

Face analysis

AWS Rekognition

Age 40-48
Gender Male, 99.7%
Calm 46%
Happy 22.7%
Sad 20.4%
Disgusted 5.5%
Confused 2.4%
Surprised 1.5%
Angry 0.8%
Fear 0.7%

AWS Rekognition

Age 51-59
Gender Male, 99.9%
Happy 84.7%
Sad 7.8%
Surprised 4.3%
Calm 1.2%
Confused 0.7%
Fear 0.5%
Angry 0.4%
Disgusted 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Likely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Tie 92.6%

Text analysis

Amazon

14725
7
14725.

Google

14725. 14725. 7
14725.
7