Human Generated Data

Title

Untitled (girl putting snowsuit on boy)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17108

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 95.8
Human 95.8
Clothing 89.4
Apparel 89.4
Home Decor 85.6
Suit 66.6
Coat 66.6
Overcoat 66.6
Indoors 66.2
Room 60.7
Photography 60.3
Photo 60.3
Furniture 57.3
Wheel 57.1
Machine 57.1
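
The label and confidence pairs above have the shape of Amazon Rekognition label detection output. A minimal sketch of how such tags could be reproduced with boto3, assuming a local copy of the photograph (the filename is hypothetical):

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Hypothetical local copy of the catalogued photograph
    with open("4.2002.17108.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,
        MinConfidence=50,
    )

    # Print "Label confidence" pairs in the same form as the list above
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')

The MaxLabels and MinConfidence values are illustrative, not the settings used to produce the list above.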

Clarifai
created on 2023-10-29

people 99.2
monochrome 98.4
music 95.9
woman 94.9
adult 92.5
man 89.7
musician 87.4
indoors 86.9
retro 85.8
child 84
nostalgia 83.4
one 83.1
portrait 82.9
two 82.8
girl 80.7
guitar 79.6
family 77.3
art 74.7
dancing 73
actress 72.6

Imagga
created on 2022-02-26

person 23.6
people 22.3
man 21.5
brass 17.2
adult 15.8
device 15.8
wind instrument 14.9
male 14.2
lifestyle 13
sport 12.3
face 12.1
technology 11.9
art 11.6
active 11.2
portrait 11
bright 10.7
musical instrument 10.6
play 10.3
music 10.3
mask 10.1
player 9.8
health 9.7
costume 9.6
style 9.6
black 9.6
party 9.4
sitting 9.4
modern 9.1
holding 9.1
fashion 9
room 9
design 9
human 9
horn 8.9
medical 8.8
body 8.8
cornet 8.7
instrument 8.6
event 8.3
professional 8.3
competition 8.2
exercise 8.2
team 8.1
celebration 8
home 8
business 7.9
equipment 7.7
luxury 7.7
grunge 7.7
bass 7.5
appliance 7.4
glasses 7.4
sexy 7.2
hand blower 7.2
hair 7.1
smile 7.1
medicine 7
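
The Imagga tags above look like output from Imagga's image tagging endpoint. A minimal sketch using requests, assuming Imagga's /v2/tags REST endpoint with placeholder credentials and a placeholder image URL:

    import requests

    # Placeholder credentials and image URL
    IMAGGA_KEY = "your_api_key"
    IMAGGA_SECRET = "your_api_secret"
    IMAGE_URL = "https://example.org/4.2002.17108.jpg"

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
    )
    response.raise_for_status()

    # Each entry carries a confidence score and a localized tag name
    for item in response.json()["result"]["tags"]:
        print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')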

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 90.6
clothing 87
person 83
cartoon 76.5
dance 64.2
black and white 50.9

Color Analysis

Face analysis

AWS Rekognition

Age 33-41
Gender Male, 94.8%
Happy 91.1%
Surprised 6.2%
Calm 1.3%
Disgusted 0.8%
Fear 0.2%
Angry 0.2%
Sad 0.1%
Confused 0.1%
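
The age range, gender, and emotion percentages above correspond to Amazon Rekognition face detection with all facial attributes requested. A minimal boto3 sketch (the filename is hypothetical):

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("4.2002.17108.jpg", "rb") as f:  # hypothetical local copy
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        gender = face["Gender"]
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        # Emotions come back unsorted; sort by confidence to match the list above
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')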

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
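
Google Cloud Vision reports each face attribute as a likelihood bucket (for example VERY_UNLIKELY) rather than a percentage, which matches the ratings above. A minimal sketch with the google-cloud-vision client (filename hypothetical):

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("4.2002.17108.jpg", "rb") as f:  # hypothetical local copy
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        # Each attribute is a likelihood enum such as VERY_UNLIKELY or VERY_LIKELY
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)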

Feature analysis

Amazon

Person 95.8%
Wheel 57.1%

Categories

Imagga

paintings art 99%

Captions

Microsoft
created on 2022-02-26

an old photo of a person 48.2%
a person standing in front of a cake 33.8%
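
The two ranked captions above match the shape of the Azure Computer Vision describe operation. A minimal sketch with the azure-cognitiveservices-vision-computervision client, assuming a placeholder endpoint and subscription key:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Placeholder Azure resource endpoint and key
    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("your_subscription_key"),
    )

    with open("4.2002.17108.jpg", "rb") as f:  # hypothetical local copy
        result = client.describe_image_in_stream(f, max_candidates=2)

    # Each candidate caption carries a confidence between 0 and 1
    for caption in result.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")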

Text analysis

Amazon

KODAK-SEELA
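
The single detected string above is the kind of film edge marking that Amazon Rekognition text detection returns. A minimal boto3 sketch (filename hypothetical):

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("4.2002.17108.jpg", "rb") as f:  # hypothetical local copy
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    # Report line-level detections only, matching the single string above
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])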