Human Generated Data

Title

Untitled (boy and girl on armchair)

Date

1952

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17778

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Furniture 99.8
Chair 99.7
Person 99.2
Human 99.2
Person 95.1
Interior Design 91.3
Indoors 91.3
Lamp 84.4
Couch 70.5
Room 68.7
Living Room 65.0
Table Lamp 60.0
Face 59.4
People 57.2
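
The Amazon tags above are label/confidence pairs of the kind returned by Amazon Rekognition's detect_labels call. A minimal sketch of reproducing such output with the boto3 SDK; configured AWS credentials and the filename photo.jpg are assumptions, not part of this record:

import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,
        MinConfidence=55.0,  # the weakest tag above ("People") scored 57.2
    )

# Each detected label carries a name and a 0-100 confidence score
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")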

Clarifai
created on 2023-10-29

people 99.5
adult 97.5
two 97.2
man 95.9
woman 95.2
child 95.2
indoors 93.8
wear 93.6
room 93.5
family 91.6
furniture 91.3
group 89.5
nostalgia 88.8
retro 88.8
three 87.7
chair 87.2
veil 82.7
sit 82.2
offspring 82.2
seat 80.9
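
The Clarifai concepts above carry confidences on a 0-1 scale, rendered here as percentages. A hedged sketch against Clarifai's v2 REST predict endpoint; the model name general-image-recognition, the API key, and the image URL are all assumptions for illustration:

import requests

CLARIFAI_KEY = "YOUR_API_KEY"  # placeholder credential
URL = "https://api.clarifai.com/v2/models/general-image-recognition/outputs"

payload = {"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]}
resp = requests.post(URL, json=payload,
                     headers={"Authorization": f"Key {CLARIFAI_KEY}"})
resp.raise_for_status()

# Concepts arrive sorted by confidence, highest first
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")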

Imagga
created on 2022-02-26

sketch 34.9
drawing 31.7
people 17.8
house 16.7
person 16.5
work 15.9
business 15.8
man 15.4
representation 14.5
architecture 14.2
glass 14.0
building 13.7
home 13.6
male 13.5
worker 12.5
men 12.0
adult 11.8
office 11.5
businessman 11.5
hand 11.4
design 11.3
human 11.2
technology 11.1
construction 11.1
black 10.8
interior 10.6
structure 10.4
wall 10.4
plan 10.4
modern 9.8
working 9.7
architect 9.7
project 9.6
professional 9.6
window 9.5
paper 9.5
finance 9.3
room 9.2
equipment 8.8
looking 8.8
chart 8.6
old 8.4
success 8.0
grunge 7.7
head 7.6
film 7.5
investment 7.3
occupation 7.3
group 7.3
computer 7.2
hair 7.1
idea 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

indoor 91.9
text 88.4
black and white 75.0
old 60.1

Color Analysis

Face analysis

AWS Rekognition

Age 9-17
Gender Male, 81.5%
Calm 55.5%
Sad 17.6%
Happy 9.9%
Surprised 5.7%
Angry 4.6%
Fear 2.4%
Confused 2.3%
Disgusted 2%
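
The age range, gender, and emotion percentages above match the shape of Amazon Rekognition's detect_faces response when all facial attributes are requested. A minimal sketch, again assuming configured AWS credentials and a placeholder filename:

import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required for age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come as a list; sort to print the dominant one first
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")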

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
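
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why this block reads "Very unlikely" instead of a score. A minimal sketch using the google-cloud-vision client library; credentials and the filename are placeholders:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each field is a Likelihood enum, e.g. VERY_UNLIKELY
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)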

Feature analysis

Amazon

Person 99.2%
Person 95.1%
Lamp 84.4%

Categories

Text analysis

Amazon

150
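
The single token "150" is an OCR hit of the kind Amazon Rekognition's detect_text call returns for text found in an image. A minimal sketch, with the same placeholder assumptions as above:

import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# Report LINE-level detections only; WORD entries repeat the same text
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])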