Human Generated Data

Title

Untitled (father helping child get dressed)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17248

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.2
Human 99.2
Person 97.2
Shoe 96.4
Footwear 96.4
Clothing 96.4
Apparel 96.4
Shoe 90.9
Room 88.5
Indoors 88.5
Furniture 83
Flooring 80.7
Shoe 79.7
Interior Design 77.5
Floor 74.8
Living Room 71.8
Chair 68.9
Photography 62.4
Photo 62.4
Wood 61.8
People 60.6
Home Decor 59.3
Portrait 58.5
Face 58.5
Blonde 57.9
Teen 57.9
Kid 57.9
Girl 57.9
Woman 57.9
Child 57.9
Female 57.9

Clarifai
created on 2023-10-29

people 100
two 98
monochrome 97.6
child 97.6
adult 97.5
woman 97.5
man 96.6
street 96.5
group 95.7
wear 95.1
group together 90.7
recreation 87.7
three 87.5
family 84
four 83
administration 82.6
offspring 80.7
kneeling 79.6
boy 79.5
several 78.7

Imagga
created on 2022-02-26

man 28.9
male 26.2
person 26.2
people 24.5
business 21.9
adult 21.5
businessman 20.3
office 18.3
corporate 18
men 17.2
professional 16.7
work 16.7
executive 16.6
suit 14.8
cleaner 14.5
worker 14.3
room 13.7
teacher 13.6
fashion 12.8
old 12.5
portrait 12.3
happy 11.9
job 11.5
working 11.5
indoors 11.4
black 11.4
smile 11.4
career 11.4
barbershop 11.3
life 11.2
building 10.8
standing 10.4
educator 10.3
clothes 10.3
clothing 10.3
looking 9.6
briefcase 9.1
one 9
shop 8.9
day 8.6
wall 8.5
casual 8.5
attractive 8.4
city 8.3
success 8
home 8
interior 8
lifestyle 7.9
love 7.9
couple 7.8
happiness 7.8
pretty 7.7
bride 7.7
groom 7.6
hand 7.6
outdoors 7.5
company 7.4
street 7.4
wedding 7.4
occupation 7.3
confident 7.3
smiling 7.2
mercantile establishment 7.1
handsome 7.1
family 7.1
staff 7
device 7
bag 7
modern 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 97.4
clothing 95.2
person 93.1
man 88.4
footwear 84.8
black and white 70.9

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 38-46
Gender Male, 62%
Calm 97%
Sad 1.5%
Happy 0.7%
Confused 0.2%
Disgusted 0.2%
Surprised 0.2%
Angry 0.2%
Fear 0.1%

Google

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99.2%
Person 97.2%

Shoe
Shoe 96.4%
Shoe 90.9%
Shoe 79.7%

Categories

Text analysis

Amazon

CARE
SERI
DO N
NAC
DP
MOOL
MANDLE
IX
٢٥
IRAS