Human Generated Data

Title

Untitled (woman with roses on stage behind podium)

Date

1954

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8857

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.5
Human 99.5
Person 99.5
Person 99.4
Person 99.3
Person 99.2
Person 99
Person 99
Plant 98.4
Person 98.1
Person 95.8
Person 94.8
Person 94.3
Flower 92.6
Blossom 92.6
Flower Arrangement 92.2
Crowd 88.9
Flower Bouquet 86.5
Interior Design 83.7
Indoors 83.7
Sitting 79.3
Vase 78.1
Jar 78.1
Pottery 78.1
Audience 77.4
People 68.9
Chair 67.6
Furniture 67.6
Potted Plant 57.8
Musician 57.6
Musical Instrument 57.6
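
The Amazon list above is label-detection output: a label name followed by a 0–100 confidence score. As a rough sketch (not part of the record), comparable tags could be generated with boto3's Rekognition client, assuming configured AWS credentials and a local copy of the image; the file name below is hypothetical.

```python
import boto3

IMAGE_PATH = "steinmetz_8857.jpg"  # hypothetical local copy of the photograph

rekognition = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=30,
        MinConfidence=55,
    )

# Print "Label confidence" pairs in the same shape as the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```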

Clarifai
created on 2023-10-26

people 99.8
child 98.8
group 97.8
group together 94.7
woman 94
man 93.5
adult 93
boy 91.7
many 90.8
ceremony 90.5
wedding 89.1
bench 88.4
music 88.2
recreation 87.5
musician 85.6
indoors 84.5
sit 83.7
several 81.9
groom 80.1
three 78.5

Imagga
created on 2022-01-15

afghan hound 36.6
hound 29.4
hunting dog 22.8
travel 15.5
dog 14.7
house 14.2
building 14.1
sky 13.4
old 13.2
landscape 12.6
structure 11.8
water 11.3
scene 11.2
architecture 10.9
window 10.3
people 10
holiday 10
religion 9.8
sea 9.4
ocean 9.1
tourism 9.1
transportation 9
river 8.9
crowd 8.6
bridge 8.5
stage 8.4
famous 8.4
wood 8.3
city 8.3
island 8.2
home 8
scenic 7.9
male 7.8
domestic animal 7.6
room 7.5
canine 7.5
church 7.4
man 7.4
historic 7.3
transport 7.3
group 7.2
landmark 7.2
coast 7.2
shop 7.2
life 7.1
rural 7
sculpture 7

Google
created on 2022-01-15

Photograph 94.2
Plant 90.7
Black 89.8
Black-and-white 87.2
Style 84
Hat 81.4
Houseplant 79.3
Monochrome photography 77.3
Font 76.9
Monochrome 76.4
Snapshot 74.3
Art 73.6
Event 72.7
Room 70.1
Suit 65.9
Stock photography 64.6
Curtain 64
Flowerpot 62.8
Picture frame 61
Photo caption 59.3
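
The Google list follows the same label/score pattern, which matches Cloud Vision label detection (scores there are 0–1 and appear here multiplied by 100). A minimal sketch under that assumption, using the google-cloud-vision client and a hypothetical local file:

```python
from google.cloud import vision

IMAGE_PATH = "steinmetz_8857.jpg"  # hypothetical local copy of the photograph

client = vision.ImageAnnotatorClient()

with open(IMAGE_PATH, "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Each annotation has a description and a 0-1 score; the record above
# appears to report score * 100.
for annotation in response.label_annotations:
    print(f"{annotation.description} {annotation.score * 100:.1f}")
```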

Microsoft
created on 2022-01-15

person 96
text 95.2
indoor 87.2
group 71.2
black and white 65.8
clothing 60.6
people 60.5
human face 52.4
posing 41.8
old 41
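
The Microsoft tags look like Azure Computer Vision image tagging, which also reports 0–1 confidences. A hedged sketch with the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
KEY = "<subscription-key>"                                         # placeholder
IMAGE_URL = "https://example.org/steinmetz_8857.jpg"               # placeholder

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

# tag_image returns tags, each with a name and a 0-1 confidence.
result = client.tag_image(IMAGE_URL)
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```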

Face analysis

AWS Rekognition

Age 34-42
Gender Male, 77.8%
Sad 72.9%
Calm 24.5%
Confused 1.6%
Happy 0.3%
Angry 0.2%
Disgusted 0.2%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 49-57
Gender Female, 85.8%
Calm 74.7%
Happy 14.6%
Sad 3.8%
Angry 2.9%
Confused 1.2%
Surprised 1.1%
Fear 1%
Disgusted 0.8%

AWS Rekognition

Age 43-51
Gender Female, 76.3%
Calm 79%
Happy 10.2%
Surprised 5.4%
Sad 2.9%
Confused 0.7%
Angry 0.6%
Disgusted 0.6%
Fear 0.6%

AWS Rekognition

Age 42-50
Gender Female, 75.7%
Calm 85.9%
Happy 8.4%
Sad 2.8%
Surprised 1.2%
Confused 0.5%
Angry 0.5%
Fear 0.3%
Disgusted 0.3%

AWS Rekognition

Age 33-41
Gender Male, 91.5%
Calm 77.4%
Happy 15.2%
Sad 3.7%
Fear 1.2%
Surprised 1%
Angry 0.6%
Confused 0.6%
Disgusted 0.3%

AWS Rekognition

Age 49-57
Gender Male, 92.8%
Calm 84%
Happy 8.3%
Sad 3.8%
Angry 2.3%
Surprised 0.5%
Fear 0.5%
Disgusted 0.3%
Confused 0.3%

AWS Rekognition

Age 31-41
Gender Female, 51.3%
Happy 70.9%
Calm 17.7%
Sad 5.6%
Fear 2%
Surprised 1.6%
Confused 0.9%
Disgusted 0.6%
Angry 0.6%
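
Each block above matches the shape of Rekognition face detection: an estimated age range, a gender guess with confidence, and a list of emotion scores, one block per detected face. A minimal boto3 sketch that prints the same fields, again assuming a hypothetical local copy of the image:

```python
import boto3

IMAGE_PATH = "steinmetz_8857.jpg"  # hypothetical local copy of the photograph

rekognition = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required for age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions are not guaranteed to be sorted; the record lists them by score.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```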

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
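
The Google Vision blocks report per-face likelihoods rather than percentages; Cloud Vision face detection returns these as enum values (VERY_UNLIKELY through VERY_LIKELY). A sketch that renders them as the plain-text labels used above, assuming a hypothetical local file:

```python
from google.cloud import vision

IMAGE_PATH = "steinmetz_8857.jpg"  # hypothetical local copy of the photograph

client = vision.ImageAnnotatorClient()

with open(IMAGE_PATH, "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

def as_text(likelihood):
    # e.g. Likelihood.VERY_UNLIKELY -> "Very unlikely"
    return vision.Likelihood(likelihood).name.replace("_", " ").capitalize()

for face in response.face_annotations:
    print("Surprise", as_text(face.surprise_likelihood))
    print("Anger", as_text(face.anger_likelihood))
    print("Sorrow", as_text(face.sorrow_likelihood))
    print("Joy", as_text(face.joy_likelihood))
    print("Headwear", as_text(face.headwear_likelihood))
    print("Blurred", as_text(face.blurred_likelihood))
```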

Feature analysis

Amazon

Person 99.5%

Text analysis

Amazon

395
light
bounce
395 34A unth bounce light
34A unth
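
The detected strings mix a whole line with its individual words, which is consistent with Rekognition text detection returning both LINE and WORD entries. A minimal boto3 sketch, assuming a hypothetical local copy of the image:

```python
import boto3

IMAGE_PATH = "steinmetz_8857.jpg"  # hypothetical local copy of the photograph

rekognition = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# TextDetections contains both LINE and WORD entries, which is why the
# list above mixes a full line with its individual words.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"],
          f"{detection['Confidence']:.1f}")
```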