Human Generated Data

Title

Untitled (baby playing with blocks)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17027

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 98
Human 98
Furniture 88
Game 77.6
Flooring 75.9
Floor 63
Chess 60.5

Clarifai
created on 2023-10-29

people 99.8
child 99.3
monochrome 99.1
indoors 97.2
furniture 96.6
room 96.1
woman 96
one 95.9
adult 95.8
man 95.7
boy 95
girl 94.9
sit 94.1
portrait 91
music 90.4
chair 88.6
seat 87.9
school 86.9
education 86.2
family 85.2

Imagga
created on 2022-02-26

laptop 68
computer 46.3
business 34
work 33.7
notebook 32.6
working 30
office 29.8
grand piano 29.1
people 29
adult 28
sitting 27.5
person 24.9
happy 24.4
man 24.2
technology 23.7
piano 23.4
seat 22.5
attractive 21.7
smile 21.4
businesswoman 20.9
student 20.8
chair 20
briefcase 19.8
corporate 19.8
pretty 19.6
lifestyle 19.5
casual 19.5
male 19.1
professional 18.9
portable computer 18.6
home 18.3
keyboard instrument 18.2
women 18.2
percussion instrument 18.2
portrait 18.1
smiling 18.1
stringed instrument 17.8
lady 16.2
desk 15.3
worker 15.1
furniture 14.7
table 14.4
looking 14.4
musical instrument 14.2
businessman 14.1
indoors 14.1
personal computer 14.1
executive 14
study 14
success 13.7
suit 13.5
job 13.3
support 12.8
sit 12.3
cheerful 12.2
face 12.1
happiness 11.7
wireless 11.4
boy 11.3
education 11.3
modern 11.2
relax 10.9
typing 10.7
couch 10.6
interior 10.6
sexy 10.4
armchair 10.1
successful 10.1
cute 10
teenager 10
device 9.9
fashion 9.8
one 9.7
digital computer 9.4
gramophone 9.3
two 9.3
house 9.2
car 9
hair 8.7
secretary 8.7
automobile 8.6
employee 8.6
men 8.6
career 8.5
youth 8.5
keyboard 8.4
joy 8.3
equipment 8.3
leisure 8.3
phone 8.3
book 8.2
indoor 8.2
child 7.8
using 7.7
room 7.7
machine 7.6
finance 7.6
hand 7.6
communication 7.6
record player 7.5
ottoman 7.4
handsome 7.1
look 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 97.2
black and white 92.1
person 66.1
music 55.7

Face analysis

AWS Rekognition

Age 6-12
Gender Female, 95.9%
Sad 48%
Calm 39.6%
Happy 9.1%
Fear 1.6%
Confused 0.5%
Angry 0.5%
Disgusted 0.4%
Surprised 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 98%

Text analysis

Amazon

KODVK-EVEETX

Google

CVEEIA KODVR
CVEEIA
KODVR