Human Generated Data

Title

Untitled (young men and women standing on raised platform)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7150

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Person 99.8
Human 99.8
Person 99.5
Person 99.4
Person 99.4
Person 98.8
Person 98.4
Person 97
Person 95
Person 86.4
Person 85.7
Bed 71.8
Furniture 71.8
Person 63.9
Portrait 63.3
Face 63.3
Photography 63.3
Photo 63.3
People 61.8
Text 61.2
Word 57.2
Overcoat 56.6
Suit 56.6
Coat 56.6
Clothing 56.6
Apparel 56.6
Chair 55.5
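
The label/confidence pairs above have the shape of an Amazon Rekognition DetectLabels response. A minimal sketch of such a call with boto3, assuming configured AWS credentials; "photo.jpg" and the region are placeholder assumptions, not details from the record:

```python
# Sketch: produce a label list like the Amazon tags above with Rekognition DetectLabels.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # region is an assumption

with open("photo.jpg", "rb") as f:  # placeholder path to a local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # the tag list above bottoms out around 55%
)

for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))
```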

Clarifai
created on 2023-10-15

people 99.5
man 98.4
group together 97.5
adult 97.1
woman 97
group 97
many 96.4
child 95.3
monochrome 93.4
ladder 93.1
outdoors 89
fun 88.8
enjoyment 84.9
climb 84.5
recreation 84.4
togetherness 82.9
adolescent 82.7
four 80.9
construction worker 80.3
three 79.6

Imagga
created on 2021-12-15

device 36.5
loom 33.7
pier 29.5
textile machine 27.8
sky 26.8
support 26.8
architecture 26.6
chair 23.4
machine 22.3
building 20.9
construction 18
structure 16.2
water 16
steel 15.3
tower 14.4
old 13.2
metal 12.9
industry 12.8
industrial 12.7
travel 12.7
landmark 12.6
silhouette 12.4
seat 11.6
tourism 11.5
urban 11.3
famous 11.2
historic 11
sea 10.9
city 10.8
bridge 10.5
iron 10.3
ocean 9.9
furniture 9.9
work 9.7
wood 9.2
menorah 9
people 8.9
river 8.9
gate 8.8
wooden 8.8
man 8.7
high 8.7
park 8.5
equipment 8.5
outdoor 8.4
power 8.4
tourist 8.4
outdoors 8.2
landscape 8.2
black 7.8
skyline 7.6
vintage 7.4
window 7.3
candelabrum 7.2
history 7.1

Google
created on 2021-12-15

Outerwear 95.2
Photograph 94.1
White 92.2
Black 89.8
Standing 86.4
Style 83.9
Black-and-white 82.2
Font 75.6
Monochrome 75.4
Monochrome photography 71.9
Vintage clothing 69.1
Crew 67.9
Hat 67.9
Room 64.8
Stock photography 64.1
History 61.3
Illustration 58.6
Uniform 58
Machine 57.2
Sitting 52.5
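
The Google tags above pair a label with a percentage-style score; Cloud Vision label detection returns the same information as label_annotations with 0-1 scores. A minimal sketch with the google-cloud-vision client, assuming configured credentials; "photo.jpg" is a placeholder path:

```python
# Sketch: label list comparable to the Google tags above, via Cloud Vision label detection.
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes GOOGLE_APPLICATION_CREDENTIALS is set

with open("photo.jpg", "rb") as f:  # placeholder path to a local copy of the photograph
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

for label in response.label_annotations:
    # score is 0-1; scale to match the percentage-style values above
    print(label.description, round(label.score * 100, 1))
```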

Microsoft
created on 2021-12-15

text 90.5
person 90.4
clothing 79.8
black and white 78.9
people 63.8
posing 40.3

Color Analysis

Face analysis

AWS Rekognition

Age 38-56
Gender Male, 50.6%
Sad 94.1%
Calm 4.2%
Happy 0.9%
Confused 0.3%
Fear 0.2%
Angry 0.1%
Disgusted 0.1%
Surprised 0.1%

AWS Rekognition

Age 26-40
Gender Female, 67%
Calm 83.2%
Happy 14.4%
Sad 1.4%
Confused 0.3%
Surprised 0.3%
Angry 0.2%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 30-46
Gender Female, 67.1%
Calm 62.8%
Sad 18.3%
Happy 9.7%
Angry 8.1%
Confused 0.5%
Surprised 0.3%
Disgusted 0.2%
Fear 0.2%

AWS Rekognition

Age 22-34
Gender Male, 65.2%
Calm 42.1%
Sad 36.2%
Happy 19%
Surprised 0.8%
Angry 0.8%
Confused 0.5%
Fear 0.5%
Disgusted 0.1%

AWS Rekognition

Age 26-42
Gender Male, 79.2%
Calm 84.9%
Happy 6.8%
Sad 6.4%
Surprised 1.1%
Angry 0.3%
Confused 0.2%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 34-50
Gender Female, 63.5%
Sad 52.7%
Angry 22.2%
Calm 16.9%
Happy 2.9%
Confused 2.7%
Fear 1.1%
Surprised 1%
Disgusted 0.5%

AWS Rekognition

Age 22-34
Gender Male, 94.4%
Calm 56.4%
Sad 27.1%
Happy 6.5%
Surprised 3.4%
Fear 2.9%
Confused 1.7%
Angry 1.5%
Disgusted 0.6%

AWS Rekognition

Age 28-44
Gender Male, 83.3%
Calm 74.4%
Surprised 12%
Confused 5.2%
Sad 4.1%
Happy 1.7%
Angry 1.7%
Fear 0.5%
Disgusted 0.5%

AWS Rekognition

Age 23-37
Gender Male, 94.2%
Calm 76.6%
Sad 18.2%
Happy 2.8%
Confused 1.1%
Angry 0.7%
Disgusted 0.2%
Surprised 0.2%
Fear 0.2%
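
The per-face age range, gender estimate, and ranked emotion scores above match the shape of an Amazon Rekognition DetectFaces response with all attributes requested. A minimal sketch with boto3, under the same placeholder assumptions as before ("photo.jpg", region, configured credentials):

```python
# Sketch: per-face age range, gender, and emotion scores via Rekognition DetectFaces.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # region is an assumption

with open("photo.jpg", "rb") as f:  # placeholder path to a local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive unsorted; sort highest-first to match the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```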

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
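
The Google Vision face results report likelihood buckets (Very unlikely through Very likely) rather than numeric scores, which is how the Cloud Vision FaceAnnotation fields are expressed. A minimal sketch, again assuming configured credentials and a placeholder "photo.jpg":

```python
# Sketch: per-face likelihood buckets via Cloud Vision face detection.
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes GOOGLE_APPLICATION_CREDENTIALS is set

with open("photo.jpg", "rb") as f:  # placeholder path to a local copy of the photograph
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each field is a Likelihood enum (VERY_UNLIKELY, UNLIKELY, POSSIBLE, LIKELY, VERY_LIKELY),
    # rendered above as "Very unlikely", "Unlikely", and so on.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```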

Feature analysis

Amazon

Person 99.8%

Categories

Text analysis

Amazon

17438
17438.
LACON

Google

17438. 174 38.
17438.
174
38.
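
The text readings above ("17438", "LACON", and the split variants) are OCR hits on lettering visible in the photograph. A minimal sketch of how such strings could be pulled out with Amazon Rekognition DetectText, under the same placeholder assumptions:

```python
# Sketch: OCR the numbers/words visible in the scene with Rekognition DetectText.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # region is an assumption

with open("photo.jpg", "rb") as f:  # placeholder path to a local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # WORD entries repeat the same text piecewise
        print(detection["DetectedText"], round(detection["Confidence"], 1))
```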