Human Generated Data

Title

Untitled (woman tying bottle to bow of boat, Mantalocking, NJ)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8503

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.9
Human 99.9
Clothing 99.9
Apparel 99.9
Person 99.5
Person 99.5
Person 99.4
Person 99.4
Person 99.2
Person 98.9
Person 98.3
Shorts 98.2
Female 98.2
Skirt 95.9
Person 95.2
Woman 93.1
Person 77.3
People 71.7
Dress 66.1
Girl 60.1
Military Uniform 56.4
Military 56.4

Clarifai
created on 2023-10-25

people 99.4
group together 97.5
man 97.2
group 95.8
woman 95.5
adult 93.8
child 91.3
leader 87.7
administration 86.9
music 86.8
many 85.6
monochrome 84.8
actor 81.9
recreation 81.5
adolescent 79.8
musician 77.1
indoors 76.9
education 71.7
audience 71.1
sitting 69.8

Imagga
created on 2022-01-09

washboard 36.8
musical instrument 35.2
device 34.7
percussion instrument 26.9
adult 23.6
male 22
people 21.7
steel drum 21.7
man 21.5
happy 15.7
couple 14.8
person 14.5
business 14
smiling 13.7
sitting 13.7
lifestyle 13.7
cheerful 13
men 12.9
laptop 12.7
outdoors 12.7
women 12.6
happiness 12.5
two 11.8
day 11.8
group 11.3
fun 11.2
computer 11.2
teacher 11.2
attractive 10.5
holding 9.9
job 9.7
businessman 9.7
table 9.6
boy 9.6
building 9.1
black 9
chair 9
technology 8.9
interior 8.8
working 8.8
work 8.8
musician 8.7
love 8.7
smile 8.5
friends 8.4
communication 8.4
fashion 8.3
indoor 8.2
stage 8.1
home 8
color 7.8
professional 7.7
summer 7.7
old 7.7
casual 7.6
leisure 7.5
child 7.4
vacation 7.4
water 7.3
worker 7.2
holiday 7.2
world 7.1
together 7

Google
created on 2022-01-09

Black 89.6
Human 89.2
Dress 88.3
Black-and-white 85.1
Style 83.9
Line 81.9
Adaptation 79.3
Motor vehicle 77.3
Monochrome photography 76.1
Vintage clothing 75.9
Monochrome 73.9
Event 72.8
Photo caption 71.8
Recreation 71.2
Suit 71
Font 68.9
Room 67.9
Advertising 66.3
Classic 65.2
Fun 64.7

Microsoft
created on 2022-01-09

person 99.8
clothing 93.3
text 87.7
man 77
black and white 58.3

Face analysis

AWS Rekognition

Age 26-36
Gender Female, 80.9%
Calm 57.3%
Sad 26.5%
Happy 14.3%
Disgusted 0.6%
Confused 0.4%
Fear 0.4%
Angry 0.4%
Surprised 0.2%

AWS Rekognition

Age 51-59
Gender Male, 98.5%
Calm 97.4%
Sad 1%
Confused 0.4%
Disgusted 0.4%
Happy 0.3%
Angry 0.2%
Fear 0.2%
Surprised 0.1%

AWS Rekognition

Age 54-62
Gender Male, 61.4%
Calm 90.9%
Happy 4.1%
Sad 1.3%
Angry 1.1%
Confused 0.9%
Disgusted 0.8%
Surprised 0.6%
Fear 0.3%

AWS Rekognition

Age 48-56
Gender Male, 99.5%
Sad 35.9%
Calm 26.2%
Happy 24.2%
Disgusted 9.1%
Fear 1.4%
Surprised 1.2%
Angry 1.2%
Confused 0.6%

AWS Rekognition

Age 50-58
Gender Male, 80.2%
Calm 76.9%
Sad 7.7%
Happy 5.8%
Disgusted 2.9%
Confused 2.9%
Surprised 1.4%
Fear 1.3%
Angry 1%

AWS Rekognition

Age 42-50
Gender Female, 82.4%
Sad 54.3%
Calm 13.2%
Surprised 8.2%
Happy 7.7%
Confused 6.3%
Angry 5.4%
Fear 3.6%
Disgusted 1.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.9%

Text analysis

Amazon

17338
17338.
NAGON
NAGON илитала
илитала
B8

Google

17338· 173 38.
17338·
173
38.