Human Generated Data

Title

Arkansas Farmer, Squatter near Bakersfield, California

Date

1930s, printed later

People

Artist: Dorothea Lange, American, 1895–1965

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Sedgewick Memorial Collection, 2.2002.710

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.5
Human 99.5
Clothing 99.1
Apparel 99.1
Person 95.4
Face 76.8
Sleeve 71.2
Shorts 69.4
Female 66
Portrait 64
Photography 64
Photo 64
Pants 59.2
Overcoat 58.7
Coat 58.7
Smile 58.2
Wood 57.1
Suit 56.9

Clarifai
created on 2023-10-26

people 99.8
adult 97.9
portrait 97.5
one 97.1
wear 90.7
two 87.4
man 86.3
monochrome 85.1
woman 84.3
actor 83.6
child 79.7
art 77.1
furniture 76.3
administration 75.1
facial expression 74.1
street 72.8
group 71.2
easel 71.1
leader 67.5
chair 67

Imagga
created on 2022-01-22

pillory 48.3
skateboard 44.8
instrument of punishment 42.5
wheeled vehicle 37.4
board 33.7
instrument 30.9
vehicle 28.4
device 23
person 22.5
adult 20.7
conveyance 18.7
male 18.4
man 18.1
women 15
outdoors 14.9
people 14.5
fashion 14.3
one 14.2
activity 13.4
lady 13
body 12
sport 12
exercise 11.8
attractive 11.2
old 11.1
hair 11.1
black 10.8
pretty 10.5
sexy 10.4
style 10.4
sitting 10.3
motion 10.3
lifestyle 10.1
fitness 9.9
posing 9.8
umbrella 9.7
clothing 9.7
jumping 9.7
urban 9.6
sky 9.6
model 9.3
casual 9.3
outdoor 9.2
portrait 9.1
fun 9
cool 8.9
success 8.8
happy 8.8
exercising 8.7
business 8.5
modern 8.4
action 8.3
city 8.3
street 8.3
human 8.2
teenager 8.2
building 7.9
brunette 7.8
face 7.8
dance 7.7
jump 7.7
balance 7.6
elegance 7.6
world 7.6
vintage 7.4
smiling 7.2
looking 7.2
active 7.2
cute 7.2
work 7.1

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 97.1
black and white 87
person 81.3
clothing 79.3
sky 62.7
old 40.3
posing 38.5

Face analysis

AWS Rekognition

Age 52-60
Gender Male, 89.3%
Calm 49.1%
Sad 28.4%
Confused 20.6%
Fear 1%
Angry 0.5%
Disgusted 0.2%
Surprised 0.2%
Happy 0%

Microsoft Cognitive Services

Age 44
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%

Captions

Microsoft
created on 2022-01-22

an old photo of a person 91.5%
old photo of a person 89.8%
a person posing for a photo 87.7%