Human Generated Data

Title

Untitled (girl playing with doll next to Christmas tree)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16855

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.2
Human 99.2
Person 98.4
Clothing 83.1
Apparel 83.1
Interior Design 78.2
Indoors 78.2
Car 75.9
Transportation 75.9
Vehicle 75.9
Automobile 75.9
Living Room 74.2
Room 74.2
Female 72.1
Furniture 71.7
Leisure Activities 69.5
Toy 68.1
Musician 67.5
Musical Instrument 67.5
People 64.5
Kid 63
Child 63
Portrait 62.1
Photography 62.1
Face 62.1
Photo 62.1
Drum 58.5
Percussion 58.5
Flooring 58.1
Girl 57.2
Costume 56
Floor 55.2
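Labels like these are the standard output of AWS Rekognition's DetectLabels operation. A minimal sketch with boto3 follows; the file name, region, and confidence cutoff are assumptions, not taken from the record (the listed tags happen to bottom out at 55.2).

```python
# Minimal sketch: image labeling with AWS Rekognition via boto3.
# "photo.jpg" and the region are placeholders.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # assumed cutoff, matching the lowest score above
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```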

Clarifai
created on 2023-10-28

people 99.8
monochrome 98.3
one 98
group together 97.5
wear 97.1
adult 96.7
vehicle 95.9
recreation 94.9
child 94.4
two 94.1
street 93.7
woman 93.1
group 91.4
music 90.9
outfit 90.2
actress 88.3
chair 84.1
guitar 83.3
seat 82.7
man 81.8
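Concept tags like these come from Clarifai's predict endpoint. Below is a rough sketch against the v2 REST API, assuming the general image-recognition model; the API key and image URL are placeholders.

```python
# Sketch: concept tagging with Clarifai's v2 REST API.
# The model ID, key, and URL below are assumptions/placeholders.
import requests

headers = {"Authorization": "Key YOUR_CLARIFAI_API_KEY"}
payload = {"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]}

resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers=headers,
    json=payload,
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```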

Imagga
created on 2022-02-26

crutch 37.1
exercise bike 32.2
staff 29.7
stick 25.5
device 25
exercise device 24.4
adult 18.1
urban 16.6
person 16.3
man 16.1
people 15.1
cleaner 14.5
building 13.6
equipment 13.3
city 13.3
support 12.4
portrait 12.3
male 12.1
industry 11.9
interior 11.5
steel 10.6
training 10.2
industrial 10
sport 10
old 9.7
one 9.7
station 9.7
indoors 9.7
metal 9.6
men 9.4
chair 9.4
turnstile 9.4
house 9.2
worker 8.9
working 8.8
military 8.7
lifestyle 8.7
high 8.7
wall 8.5
energy 8.4
help 8.4
health 8.3
occupation 8.2
human 8.2
protection 8.2
transportation 8.1
activity 8.1
machine 8
science 8
clothing 7.9
work 7.8
standing 7.8
power 7.6
fashion 7.5
dark 7.5
floor 7.4
structure 7.4
gate 7.4
window 7.3
gun 7.3
women 7.1
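Imagga exposes tagging through its /v2/tags endpoint with HTTP basic auth. A minimal sketch, assuming an image reachable by URL; the credentials are placeholders.

```python
# Sketch: auto-tagging with Imagga's /v2/tags endpoint.
# API key/secret and the image URL are placeholders.
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),
)
resp.raise_for_status()

for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```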

Google
created on 2022-02-26

(no tags returned)

Microsoft
created on 2022-02-26

text 98.9
cartoon 80.5
black and white 61.4
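Tags of this kind are returned by Azure Computer Vision's analyze endpoint with the Tags visual feature. A minimal sketch; the resource endpoint, key, and file name are placeholders.

```python
# Sketch: tagging with Azure Computer Vision (v3.2 analyze endpoint).
# Endpoint, key, and file name are placeholders.
import requests

endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
headers = {
    "Ocp-Apim-Subscription-Key": "YOUR_KEY",
    "Content-Type": "application/octet-stream",
}

with open("photo.jpg", "rb") as f:
    resp = requests.post(
        f"{endpoint}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers=headers,
        data=f.read(),
    )
resp.raise_for_status()

for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```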

Color Analysis

(no data)

Face analysis

AWS Rekognition

Age 24-34
Gender Female, 76.3%
Calm 98.1%
Angry 1.2%
Sad 0.4%
Happy 0.1%
Surprised 0.1%
Confused 0%
Disgusted 0%
Fear 0%
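Age range, gender, and emotion scores like these are what Rekognition's DetectFaces returns when all attributes are requested. A minimal boto3 sketch; the file name and region are placeholders.

```python
# Sketch: face analysis with AWS Rekognition DetectFaces via boto3.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```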

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
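Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why the values above are qualitative. A minimal sketch with the google-cloud-vision client; the file name is a placeholder.

```python
# Sketch: face detection with the google-cloud-vision client library.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```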

Feature analysis

Amazon

Person 99.2%
Person 98.4%
Car 75.9%
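These per-object scores plausibly correspond to Rekognition labels that carry localized instances (bounding boxes), which is how DetectLabels distinguishes countable objects such as Person and Car from scene-level tags. A minimal sketch; the file name and region are placeholders.

```python
# Sketch: extracting per-instance detections (bounding boxes) from
# a Rekognition DetectLabels response.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(Image={"Bytes": f.read()})

for label in response["Labels"]:
    for instance in label.get("Instances", []):
        print(f"{label['Name']} {instance['Confidence']:.1f}% "
              f"bbox={instance['BoundingBox']}")
```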

Categories

Imagga

interior objects 98.5%

Captions

Microsoft
created on 2022-02-26

a person standing in a room 74.6%
a person in a room 72.3%
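Captions like these come from Azure Computer Vision's describe endpoint, which returns candidate sentences with confidences. A minimal sketch; endpoint, key, and file name are placeholders.

```python
# Sketch: image captioning with Azure Computer Vision (v3.2 describe).
import requests

endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
headers = {
    "Ocp-Apim-Subscription-Key": "YOUR_KEY",
    "Content-Type": "application/octet-stream",
}

with open("photo.jpg", "rb") as f:
    resp = requests.post(
        f"{endpoint}/vision/v3.2/describe", headers=headers, data=f.read()
    )
resp.raise_for_status()

for caption in resp.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")
```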

Text analysis

Amazon

YТ37А-AX

Google

EL
EL
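Strings like these are OCR hits on incidental text in the photograph. On the Amazon side they would come from Rekognition's DetectText; Google Vision's equivalent is text_detection. A minimal boto3 sketch; the file name and region are placeholders.

```python
# Sketch: OCR with AWS Rekognition DetectText via boto3.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip word-level duplicates
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}%")
```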