Human Generated Data

Title

Untitled (wheat harvest, central Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.987

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Boy 99.6
Child 99.6
Male 99.6
Person 99.6
Male 99.4
Person 99.4
Adult 99.4
Man 99.4
Male 98.7
Person 98.7
Adult 98.7
Man 98.7
Person 93.1
Face 92.2
Head 92.2
Person 91.4
Baby 91.4
Photography 89.6
Portrait 89.6
Firearm 73.6
Weapon 73.6
Skin 66.7
Tattoo 66.1
Clothing 57.4
Pants 57.4
Gun 57.2
Handgun 57.2
People 56.6
Body Part 56.1
Finger 56.1
Hand 56.1
Accessories 56
Baseball 56
Baseball Glove 56
Glove 56
Sport 56
Blouse 55.9
Door 55.6
Shirt 55.5
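
For context on how a label list like the one above is typically produced, here is a minimal Python sketch of a request to Amazon Rekognition's detect_labels operation via boto3. The filename "shahn_wheat_harvest.jpg" and the MinConfidence threshold of 55 are illustrative assumptions, not details of the museum's actual tagging pipeline.

    # Sketch: request image labels from Amazon Rekognition (boto3).
    # Assumes AWS credentials are configured and the photograph is
    # saved locally under a hypothetical filename.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("shahn_wheat_harvest.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=55,  # roughly the floor of the scores listed above
    )

    for label in response["Labels"]:
        print(label["Name"], round(label["Confidence"], 1))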

Clarifai
created on 2018-05-11

people 99.9
child 99
group 98.5
two 98.2
adult 97.8
three 97.6
administration 96
offspring 94.9
man 94.7
facial expression 94.7
portrait 94.4
woman 93.9
four 93.6
wear 92.1
sibling 91.8
music 90
group together 89.9
boy 89.4
sit 89.3
family 89.1

Imagga
created on 2023-10-06

male 36.1
man 32.2
people 30.1
person 28.2
child 27.9
family 26.7
happy 25.1
home 23.9
couple 22.6
smiling 19.5
boy 19.1
lifestyle 18.8
smile 17.8
indoors 17.6
adult 17.6
love 17.4
two 16.9
kid 16.8
together 15.8
portrait 15.5
fun 15
happiness 14.9
mother 14.5
room 14.4
husband 14.3
grandfather 13.5
children 12.7
casual 12.7
retired 12.6
old 12.5
cheerful 12.2
senior 12.2
face 12.1
men 12
looking 12
youth 11.9
interior 11.5
cute 11.5
day 11
playing 10.9
teacher 10.5
wife 10.4
education 10.4
togetherness 10.4
school 10.4
play 10.3
sitting 10.3
father 10.1
blond 10.1
parent 10
leisure 10
black 9.7
clothing 9.6
elderly 9.6
talking 9.5
women 9.5
pair 9.4
indoor 9.1
business 9.1
holding 9.1
dress 9
human 9
daughter 8.9
classroom 8.9
to 8.8
little 8.8
two people 8.7
30s 8.7
retirement 8.6
desk 8.5
attractive 8.4
horizontal 8.4
mature 8.4
joy 8.4
care 8.2
aged 8.1
group 8.1
son 8
standing 7.8
grandma 7.8
blackboard 7.7
office 7.7
adults 7.6
laughing 7.6
house 7.5
student 7.4
professional 7.4
friendly 7.3
childhood 7.2
hair 7.1
handsome 7.1
working 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 99.9
man 97.5
standing 80.9

Color Analysis

Face analysis

AWS Rekognition

Age 54-64
Gender Male, 100%
Happy 99.7%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Angry 0.1%
Confused 0.1%
Disgusted 0%
Calm 0%

AWS Rekognition

Age 14-22
Gender Male, 99%
Sad 99.9%
Surprised 7.7%
Fear 6.7%
Confused 5%
Calm 3.7%
Disgusted 3.4%
Angry 2.9%
Happy 2.2%

AWS Rekognition

Age 24-34
Gender Male, 100%
Calm 67.2%
Confused 15.7%
Surprised 10.1%
Fear 6.3%
Sad 5.4%
Disgusted 1.4%
Angry 1.2%
Happy 0.8%

AWS Rekognition

Age 21-29
Gender Male, 93.2%
Sad 59.5%
Angry 19%
Calm 15.4%
Happy 13.5%
Fear 10.8%
Disgusted 9.6%
Surprised 6.8%
Confused 2%
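
The four AWS Rekognition blocks above report an estimated age range, a gender guess, and per-emotion confidences for each detected face. A minimal sketch of the corresponding boto3 call follows; the filename is a hypothetical placeholder and the output formatting only approximates the listing above.

    # Sketch: per-face age, gender, and emotion estimates from
    # Amazon Rekognition's detect_faces with full attributes.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("shahn_wheat_harvest.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # include AgeRange, Gender, Emotions, etc.
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.0f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")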

Microsoft Cognitive Services

Age 72
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely
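
The Google Vision entries above are likelihood buckets (Very unlikely through Very likely) rather than numeric scores. A minimal sketch using the google-cloud-vision Python client (a 2.x or later release is assumed) is shown below, again with a hypothetical local filename.

    # Sketch: face likelihood buckets from the Google Cloud Vision API.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("shahn_wheat_harvest.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)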

Feature analysis

Amazon

Boy 99.6%
Child 99.6%
Male 99.6%
Person 99.6%
Adult 99.4%
Man 99.4%
Baby 91.4%

Categories

Imagga

paintings art 47.1%
pets animals 32.6%
people portraits 18.9%

Captions