Human Generated Data

Title

Ditched, Stalled, and Stranded. Missouri Farmer in San Joaquin Valley, California, 1936.

Date

1940s

People

Artist: Dorothea Lange, American, 1895–1965

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.702

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 97.3
Human 97.3
Person 96.8
Driving 81
Transportation 81
Vehicle 81
Cushion 75.9
Steering Wheel 61.8
People 57.9
Machine 57.7
Car Seat 57

Clarifai
created on 2023-10-26

people 100
adult 99
military 98.3
vehicle 98.1
war 97.9
two 97.9
portrait 97.6
man 97.5
soldier 95.6
wear 95
one 94.7
transportation system 94.7
child 93.3
three 92.3
group 90.9
leader 87
group together 86.5
aircraft 86.3
uniform 85.8
outfit 85.6

Imagga
created on 2022-01-22

person 30.7
people 24
portrait 23.3
man 20.2
car 19.5
adult 18.8
smile 17.8
attractive 17.5
happy 16.9
male 16.3
fashion 15.1
pretty 14.7
black 14.5
lifestyle 14.4
passenger 14.2
umbrella 14.2
model 14
human 13.5
love 13.4
automobile 13.4
vehicle 13.4
sexy 12.8
dark 12.5
sitting 12
hair 11.9
sensuality 11.8
expression 11.1
happiness 11
face 10.6
brunette 10.4
women 10.3
newspaper 10.1
elevator 9.8
posing 9.8
one 9.7
auto 9.6
smiling 9.4
window 9
device 9
driver 8.7
couple 8.7
youth 8.5
elegance 8.4
holding 8.2
fun 8.2
outdoors 8.2
dress 8.1
cheerful 8.1
looking 8
body 8
billboard 7.9
business 7.9
art 7.9
lifting device 7.8
guy 7.8
men 7.7
skin 7.7
bride 7.7
old 7.7
outdoor 7.6
rain 7.5
product 7.4
blond 7.2
work 7.2
transportation 7.2
clothing 7.1
travel 7
autumn 7
look 7

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 99.3
clothing 97.6
human face 96.6
person 94.5
man 88.6
smile 63.5
old 63.4
clothes 15.7

Color Analysis

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 40-48
Gender Male, 100%
Confused 95.5%
Calm 3.6%
Sad 0.4%
Disgusted 0.1%
Fear 0.1%
Angry 0.1%
Surprised 0%
Happy 0%

AWS Rekognition

Age 35-43
Gender Male, 91.3%
Calm 99.2%
Sad 0.3%
Confused 0.2%
Happy 0.1%
Surprised 0.1%
Fear 0.1%
Angry 0%
Disgusted 0%

Microsoft Cognitive Services

Age 36
Gender Female

Microsoft Cognitive Services

Age 44
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Feature analysis

Amazon

Person 97.3%

Categories

Imagga

paintings art 99.8%

Captions

Microsoft
created on 2022-01-22

an old photo of a man 84.2%