Human Generated Data

Title

Untitled (five fishermen standing in front of fishing boat)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4899

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.5
Human 99.5
Person 99.2
Person 98.9
Person 98.9
Person 98.4
Clothing 97
Apparel 97
Face 71.7
Sailor Suit 71.5
Sleeve 70.8
Person 69.9
People 65.4
Coat 63.8
Portrait 62
Photography 62
Photo 62
Shorts 58.6
Suit 56.9
Overcoat 56.9

Clarifai
created on 2023-10-26

people 99.8
man 97.2
adult 97.1
group 96.8
woman 92.2
princess 90.8
wedding 90.8
wear 90.6
bride 88.7
ceremony 87.8
musician 87.8
dress 87.2
actor 85.5
leader 85.1
group together 84.7
portrait 83.7
veil 83.6
many 80.8
administration 78.7
music 78.6

Imagga
created on 2022-01-23

cemetery 13.4
celebration 12.8
water 12.7
decoration 12.3
business 12.1
clothing 12
white 11.9
art 11.8
people 11.7
bride 10.8
sign 10.5
symbol 10.1
ripple 9.5
holiday 9.3
old 9
dress 9
design 9
cold 8.6
glass 8.6
winter 8.5
adult 8.4
person 8.3
tourism 8.2
ice 8.1
religion 8.1
detail 8
flag 8
drawing 8
cool 8
black 7.8
travel 7.7
splash 7.7
two 7.6
hand 7.6
embroidery 7.5
technology 7.4
closeup 7.4
wedding 7.4
drop 7.2
paper 7.2
man 7.1
country 7
snow 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 99.7
clothing 81.7
wedding dress 80.4
person 79.9
posing 79.4
black and white 67.9
bride 66.2
old 55.3
dressed 26.9
clothes 25.1

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 23-31
Gender Male, 99.8%
Calm 87.9%
Sad 7.2%
Confused 1.3%
Angry 1%
Happy 1%
Fear 0.6%
Disgusted 0.5%
Surprised 0.4%

AWS Rekognition

Age 50-58
Gender Male, 99.6%
Happy 38%
Calm 27.2%
Sad 17.3%
Confused 7.2%
Surprised 5.8%
Angry 2.5%
Disgusted 1.2%
Fear 0.8%

AWS Rekognition

Age 49-57
Gender Male, 95.4%
Sad 39.1%
Disgusted 21.4%
Happy 17.7%
Calm 7.6%
Confused 6.6%
Angry 4.3%
Surprised 1.7%
Fear 1.6%

AWS Rekognition

Age 51-59
Gender Male, 99%
Calm 47.7%
Sad 19.3%
Confused 12.9%
Fear 8.1%
Disgusted 4.6%
Happy 3%
Angry 2.6%
Surprised 1.9%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%

Categories

Imagga

paintings art 99.4%

Text analysis

Amazon

CHADNICK.
17584. CHADNICK.
17584.

Google

17584. 11684.CHADNICK.
17584.
11684.CHADNICK.