Human Generated Data

Title

Untitled (woman with roses on stage behind podium)

Date

1954

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8852

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.4
Human 99.4
Person 99.2
Person 99.2
Person 99.1
Clothing 98.7
Apparel 98.7
Person 98.7
Person 98.3
Plant 98.2
Person 97.9
Person 97.4
Flower 94.9
Blossom 94.9
Flower Arrangement 94.5
Person 93.8
Flower Bouquet 93.5
Suit 92.1
Coat 92.1
Overcoat 92.1
Person 91
Person 88.7
Interior Design 86.8
Indoors 86.8
Funeral 74.3
Wedding 71.2
Gown 69.7
Fashion 69.7
People 69.3
Crowd 68.7
Lighting 68.3
Person 66.4
Robe 60.6
Dress 59.4
Tuxedo 58.4
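
The label/confidence pairs above are consistent with output from Amazon Rekognition's label-detection endpoint. Below is a minimal boto3 sketch of how comparable tags could be generated; the file name, region, and thresholds are illustrative assumptions, not part of this record.

import boto3

# Assumed setup: region and credentials are placeholders, not documented in this record.
client = boto3.client("rekognition", region_name="us-east-1")

# "photo.jpg" stands in for the digitized print.
with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,        # illustrative limit
        MinConfidence=50.0,  # illustrative threshold
    )

# Each label carries a name and a confidence score, as listed above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")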

Clarifai
created on 2023-10-26

people 99.7
child 97.5
group 96.7
music 96.4
musician 96
indoors 94.6
ceremony 94.1
actress 92.4
man 91.7
singer 91.7
many 91.7
adult 91.5
woman 88.6
sit 88.4
administration 88.3
leader 85.1
audience 84.8
piano 84.1
wedding 83.9
wear 82.9

Imagga
created on 2022-01-15

architecture 21.2
building 19.8
afghan hound 17.8
travel 15.5
window 14.7
wedding 14.7
house 14.3
hound 14.2
structure 14.1
outfit 13.6
city 13.3
people 12.8
statue 12.3
sky 12.1
table 11.5
bride 11.5
holiday 11.4
water 11.3
flowers 11.3
famous 11.1
old 11.1
hunting dog 11.1
sculpture 10.8
tourism 10.7
couple 10.4
marriage 10.4
chair 10.4
celebration 10.4
balcony 10.2
art 9.7
home 9.6
luxury 9.4
fountain 9.2
dress 9
landscape 8.9
scenic 8.8
urban 8.7
women 8.7
groom 8.7
love 8.7
scene 8.6
estate 8.5
two 8.5
vacation 8.2
restaurant 8.2
landmark 8.1
new 8.1
religion 8.1
male 7.8
column 7.7
party 7.7
elegant 7.7
wall 7.7
flower 7.7
windows 7.7
husband 7.6
wife 7.6
relax 7.6
human 7.5
monument 7.5
man 7.4
negative 7.3
historic 7.3
adult 7.3
tourist 7.2
dog 7.2
night 7.1
happiness 7

Google
created on 2022-01-15

Photograph 94.2
Plant 91.5
Black 89.7
Black-and-white 86.5
Style 84
Monochrome 77.2
Monochrome photography 77.2
Snapshot 74.3
Font 74.2
Suit 73.9
Event 72.8
Room 71.7
Art 68.9
Stock photography 67.1
Flower Arranging 66.5
Curtain 65.6
Floristry 61.2
Picture frame 60.8
Gown 59.9
Child 59.3

Microsoft
created on 2022-01-15

person 95.7
text 93.6
indoor 90.4
group 71.8
clothing 66.3
wedding dress 64.3
woman 59.1
people 55.3
black and white 52
posing 38.1

Face analysis

AWS Rekognition

Age 51-59
Gender Male, 93.1%
Calm 94.5%
Sad 1.8%
Confused 1.4%
Fear 0.6%
Happy 0.6%
Disgusted 0.4%
Surprised 0.4%
Angry 0.3%

AWS Rekognition

Age 47-53
Gender Male, 94%
Calm 52.3%
Happy 46.2%
Angry 0.6%
Sad 0.3%
Confused 0.2%
Surprised 0.2%
Disgusted 0.2%
Fear 0%

AWS Rekognition

Age 35-43
Gender Female, 54.5%
Happy 76%
Calm 16.2%
Sad 3.1%
Confused 1.7%
Angry 1%
Disgusted 0.9%
Fear 0.6%
Surprised 0.5%

AWS Rekognition

Age 23-33
Gender Female, 69.3%
Fear 69.9%
Calm 22%
Happy 5%
Sad 1.2%
Angry 1%
Confused 0.3%
Disgusted 0.3%
Surprised 0.3%

AWS Rekognition

Age 33-41
Gender Female, 59.4%
Calm 95.1%
Happy 3.3%
Sad 0.4%
Surprised 0.4%
Angry 0.3%
Confused 0.3%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 27-37
Gender Male, 87.6%
Happy 96.1%
Sad 1.2%
Surprised 0.8%
Angry 0.6%
Fear 0.6%
Calm 0.4%
Confused 0.2%
Disgusted 0.2%

AWS Rekognition

Age 24-34
Gender Male, 89.1%
Calm 96.6%
Sad 2%
Happy 0.5%
Surprised 0.2%
Confused 0.2%
Fear 0.2%
Disgusted 0.1%
Angry 0.1%
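
The per-face age, gender, and emotion estimates above match the shape of Amazon Rekognition's face-detection response. A hedged boto3 sketch follows; the file name and region are assumptions.

import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # assumed region

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, and emotions
    )

# One FaceDetails entry per detected face, mirroring the per-face blocks above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    emotions = sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True)
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in emotions:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")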

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
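
The likelihood ratings above (Surprise, Anger, Sorrow, Joy, Headwear, Blurred) follow the Google Cloud Vision face-annotation schema. A minimal sketch using the google-cloud-vision client is below; the file name is a placeholder.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# The likelihood enum maps to the wording used in the blocks above.
LIKELIHOOD = ("Unknown", "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely")

for face in response.face_annotations:
    print("Surprise", LIKELIHOOD[face.surprise_likelihood])
    print("Anger", LIKELIHOOD[face.anger_likelihood])
    print("Sorrow", LIKELIHOOD[face.sorrow_likelihood])
    print("Joy", LIKELIHOOD[face.joy_likelihood])
    print("Headwear", LIKELIHOOD[face.headwear_likelihood])
    print("Blurred", LIKELIHOOD[face.blurred_likelihood])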

Feature analysis

Amazon

Person 99.4%

Text analysis

Amazon

39533.
#2.=
MJ17--YT7--

Google

39533. =,乙林
39533.
=
,
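
The strings above read like raw output from the providers' text-detection endpoints, reported as detected without interpretation. A minimal Amazon Rekognition sketch is below; the file name and region are assumptions.

import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # assumed region

with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

# LINE-level detections correspond to strings such as "39533." above.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], round(detection["Confidence"], 1))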