Human Generated Data

Title

Untitled (wheat harvest, central Ohio)

Date

1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.3123

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Person 99.9
Human 99.9
Person 99.8
Person 99.5
Outdoors 87
Nature 86.1
Clothing 84.4
Apparel 84.4
Housing 68.8
Building 68.8
Door 67.1
People 63.4
Porch 62.6
Countryside 58
Shorts 55.6

Clarifai
created on 2023-10-15

people 100
group together 99.1
group 98.9
man 98.7
three 97.8
adult 97.1
two 96.3
administration 95.8
family 95.1
woman 93
four 91.9
home 91.2
child 90.7
war 89.1
leader 88.4
street 88.3
soldier 86.6
portrait 86.2
monochrome 82.3
collage 82.1

Imagga
created on 2021-12-15

man 39.6
male 33.4
people 26.8
musical instrument 26.5
person 21.5
wind instrument 17.3
couple 16.5
device 16.3
adult 15.9
two 14.4
home 13.6
outdoors 13.4
happy 13.2
boy 13
child 12.8
human 12.7
business 12.1
men 12
stringed instrument 12
violin 11.9
youth 11.9
gun 11.3
weapon 11.3
smiling 10.8
outdoor 10.7
hand 10.6
together 10.5
standing 10.4
brass 10.3
love 10.3
day 10.2
bowed stringed instrument 10
instrument 10
leisure 10
building 9.9
horn 9.9
park 9.9
portrait 9.7
guy 9.3
smile 9.3
joy 9.2
family 8.9
businessman 8.8
lifestyle 8.7
happiness 8.6
husband 8.6
tool 8.3
playing 8.2
danger 8.2
office 8.1
suit 8.1
activity 8.1
romance 8
work 7.8
outside 7.7
casual 7.6
photographer 7.6
wife 7.6
togetherness 7.5
fashion 7.5
professional 7.5
holding 7.4
teen 7.3
cheerful 7.3
teenager 7.3
group 7.2
room 7.2
black 7.2
summer 7.1
indoors 7

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

outdoor 98.8
clothing 98.5
person 98
standing 94.8
text 94.5
man 91.4
group 62.7
posing 54.5

Color Analysis

Face analysis

AWS Rekognition

Age 26-42
Gender Male, 92.6%
Calm 86.2%
Angry 6.1%
Happy 3.8%
Sad 1.6%
Disgusted 1.3%
Surprised 0.3%
Confused 0.3%
Fear 0.3%

AWS Rekognition

Age 42-60
Gender Male, 99.3%
Angry 47.7%
Calm 34.4%
Confused 7.2%
Disgusted 5.4%
Sad 2.5%
Happy 1.6%
Surprised 0.9%
Fear 0.3%

Microsoft Cognitive Services

Age 54
Gender Male

Microsoft Cognitive Services

Age 40
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.9%