Human Generated Data

Title

Untitled (group of adults posing on dock)

Date

1949

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10592

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.8
Human 99.8
Person 99.8
Person 99.8
Person 99.7
Person 99.7
Person 99.5
Person 99.5
Shorts 99.5
Clothing 99.5
Apparel 99.5
Person 99.1
Person 98.8
Person 85
Stage 82.5
Female 81.5
People 74.9
Crowd 65.2
Woman 64.8
Brick 59.9
Outdoors 59.3
Skirt 56.4
Photography 56
Photo 56
Sleeve 55.5
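
The Amazon tags above are the kind of output returned by Amazon Rekognition label detection. Below is a minimal sketch of that call via boto3; the S3 bucket and object key are placeholders, not the museum's actual storage, and the MinConfidence threshold is illustrative only.

```python
# Sketch: produce label/confidence pairs like the list above with
# Amazon Rekognition DetectLabels. Bucket and key are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-dock.jpg"}},
    MaxLabels=50,
    MinConfidence=55,
)

# Each label carries a name, a confidence score (0-100), and, for labels
# such as Person, per-instance bounding boxes (used in "Feature analysis").
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```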

Clarifai
created on 2023-10-25

people 99.6
group together 99.1
music 98.3
many 97.3
adult 94.9
wear 93.9
musician 93.5
woman 93.2
singer 93.1
man 92.5
group 92.3
victory 90.3
band 86.8
administration 85.7
stage 85.4
recreation 80.9
several 80.1
athlete 79.4
concert 78.7
five 78.1

Imagga
created on 2022-01-09

stage 69.6
platform 50.1
people 27.9
musical instrument 22.9
male 19.9
person 19.5
man 18.8
musician 16
wind instrument 15.9
couple 15.7
adult 15.2
silhouette 14.9
model 14.8
men 13.7
performer 13.7
dress 13.5
women 13.4
love 13.4
happiness 13.3
portrait 12.9
black 12.6
leisure 12.4
singer 12.3
group 12.1
dark 11.7
night 11.5
bride 11.5
fashion 11.3
lifestyle 10.8
happy 10.6
dance 10.5
human 9.7
lady 9.7
style 9.6
body 9.6
youth 9.4
two 9.3
elegance 9.2
music 9
percussion instrument 9
sunset 9
clothing 8.8
together 8.8
outdoor 8.4
pretty 8.4
attractive 8.4
wedding 8.3
fun 8.2
outfit 8.2
sexy 8
water 8
posing 8
celebration 8
business 7.9
band 7.8
bouquet 7.5
accordion 7.5
one 7.5
holiday 7.2
romantic 7.1
summer 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 98.4
clothing 97.4
person 97
footwear 90.1
man 90
black and white 85.8
white 63.5
posing 38.4

Color Analysis

Face analysis

AWS Rekognition

Age 27-37
Gender Male, 93.5%
Calm 99.8%
Sad 0.1%
Surprised 0.1%
Happy 0%
Confused 0%
Angry 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 33-41
Gender Female, 77.7%
Calm 54.4%
Happy 21.8%
Sad 16.8%
Confused 3.9%
Angry 1.6%
Disgusted 0.5%
Surprised 0.5%
Fear 0.4%

AWS Rekognition

Age 36-44
Gender Male, 97.8%
Sad 93.1%
Calm 3.1%
Happy 2.6%
Disgusted 0.5%
Confused 0.2%
Fear 0.2%
Angry 0.1%
Surprised 0.1%

AWS Rekognition

Age 31-41
Gender Female, 99.3%
Happy 81.6%
Calm 17.6%
Sad 0.4%
Disgusted 0.1%
Surprised 0.1%
Confused 0.1%
Angry 0.1%
Fear 0%

AWS Rekognition

Age 30-40
Gender Female, 87.7%
Calm 83.9%
Happy 14.1%
Sad 0.7%
Surprised 0.4%
Fear 0.3%
Confused 0.3%
Disgusted 0.1%
Angry 0.1%

AWS Rekognition

Age 42-50
Gender Male, 59.6%
Calm 99.7%
Happy 0.1%
Surprised 0.1%
Sad 0.1%
Confused 0%
Disgusted 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 23-31
Gender Male, 99.6%
Happy 60.3%
Calm 30.9%
Fear 4.9%
Sad 1.2%
Surprised 1%
Disgusted 0.7%
Confused 0.6%
Angry 0.3%

AWS Rekognition

Age 30-40
Gender Male, 98.8%
Calm 99.7%
Surprised 0.1%
Happy 0.1%
Disgusted 0%
Sad 0%
Angry 0%
Confused 0%
Fear 0%

AWS Rekognition

Age 33-41
Gender Male, 98.7%
Confused 55%
Calm 28.1%
Happy 7%
Sad 4.8%
Surprised 1.9%
Fear 1.3%
Disgusted 1.2%
Angry 0.8%

AWS Rekognition

Age 39-47
Gender Female, 95.7%
Calm 70.5%
Happy 26%
Sad 1.5%
Surprised 0.8%
Disgusted 0.4%
Confused 0.3%
Fear 0.3%
Angry 0.2%
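
The per-face estimates above (age range, gender, ranked emotions) match the shape of Amazon Rekognition face detection output. A minimal sketch, assuming the same placeholder S3 location as before:

```python
# Sketch: read age range, gender, and emotion confidences per detected face
# with Amazon Rekognition DetectFaces. Bucket and key are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-dock.jpg"}},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back with confidences; sort to mirror the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```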

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
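
The Google Vision rows above report per-face likelihood buckets (Surprise, Anger, Sorrow, Joy, Headwear, Blurred) rather than percentages. A minimal sketch of producing them with the Cloud Vision Python client, assuming a local copy of the image; the filename is a placeholder.

```python
# Sketch: per-face likelihoods with the Google Cloud Vision API.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz-dock.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihood enum values index into this tuple (per the API's ordering).
likelihood_names = (
    "UNKNOWN", "VERY_UNLIKELY", "UNLIKELY", "POSSIBLE", "LIKELY", "VERY_LIKELY"
)
for face in response.face_annotations:
    print("Surprise", likelihood_names[face.surprise_likelihood])
    print("Anger", likelihood_names[face.anger_likelihood])
    print("Sorrow", likelihood_names[face.sorrow_likelihood])
    print("Joy", likelihood_names[face.joy_likelihood])
    print("Headwear", likelihood_names[face.headwear_likelihood])
    print("Blurred", likelihood_names[face.blurred_likelihood])
```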

Feature analysis

Amazon

Person 99.8%

Text analysis

Amazon

26439-F
NAGO

Google

26439-F
26439-F
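
The detected strings above are the kind of result returned by Amazon Rekognition text detection (Google Cloud Vision's text_detection is the analogous call, which can report the same string once per region). A minimal sketch, again with placeholder storage names:

```python
# Sketch: read text strings from the image with Amazon Rekognition
# DetectText. Bucket and key are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-dock.jpg"}}
)

# LINE detections give whole strings; WORD detections break them apart,
# so the same text can appear more than once in the raw output.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}")
```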