Human Generated Data

Title

Untitled (children reading magazines)

Date

c. 1960

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17234

Machine Generated Data

Tags (confidence, %)

Amazon
created on 2022-02-26

Person 99.1
Human 99.1
Sitting 93.6
Person 93.6
Clothing 70.8
Apparel 70.8
Face 69.6
Portrait 68.1
Photography 68.1
Photo 68.1
Text 66.4
Furniture 63.6
Indoors 61.2
Flooring 60.3
Room 58.7
Bowl 56.8
Person 43.5
Person 43.1
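
The Rekognition list above has the shape of DetectLabels output: each label carries a confidence score, and parent labels (Human for Person, Apparel for Clothing) appear as separate rows with the same score. A minimal boto3 sketch that would produce a comparable list follows; the file name, region, and thresholds are assumptions, not values taken from this record.

    # Hedged sketch: reproducing Rekognition label tags with boto3.
    # photo.jpg, the region, and MinConfidence are placeholders.
    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=25,
            MinConfidence=40.0,
        )

    for label in response["Labels"]:
        # Parent labels (e.g. "Human" for "Person") arrive as separate
        # entries, which explains the duplicated scores in the list above.
        print(f"{label['Name']} {label['Confidence']:.1f}")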

Clarifai
created on 2023-10-28

people 99.9
adult 98.9
furniture 97.8
child 97.7
two 97.1
group 96.9
education 95.1
sit 94.6
table 94.3
man 94.2
woman 93.3
wear 92.7
three 92.7
monochrome 92.3
room 91.9
veil 91.8
one 91.2
concentration 90.6
chair 90.4
desk 89.6
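
Clarifai scores concepts from 0 to 1; the list above renders them as percentages. Below is a hedged sketch of a v2 REST "outputs" call that returns such a concept list; the model ID, access token, and file name are placeholders, and the exact model Clarifai ran in 2023 is not recorded here.

    # Hedged sketch of the Clarifai v2 REST API; the credentials and the
    # model identifier are assumptions.
    import base64
    import requests

    API_KEY = "YOUR_CLARIFAI_PAT"           # placeholder access token
    MODEL_ID = "general-image-recognition"  # assumed general model ID

    with open("photo.jpg", "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode("ascii")

    resp = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
        timeout=30,
    )
    resp.raise_for_status()

    # Concepts come back scored 0-1; scale to match the percentages above.
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")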

Imagga
created on 2022-02-26

room 30.8
man 30.2
people 23.4
male 22.7
person 21.5
computer 18.5
table 18.3
office 17.4
business 17
businessman 16.8
laptop 16.5
adult 16.3
home 15.9
sitting 15.4
men 14.6
equipment 14.5
chair 14.4
interior 14.1
meeting 14.1
indoors 14
furniture 13.5
desk 13.4
happy 13.1
modern 12.6
floor 12.1
indoor 11.9
work 11.8
health 11.1
worker 10.9
house 10.9
smiling 10.8
hospital 10.7
corporate 10.3
senior 10.3
women 10.3
device 10
working 9.7
window 9.4
lifestyle 9.4
professional 9.2
seat 9.2
team 8.9
teacher 8.9
machine 8.7
communication 8.4
mature 8.4
lamp 8
medicine 7.9
glass 7.8
day 7.8
conference 7.8
wall 7.7
executive 7.7
casual 7.6
relax 7.6
relaxation 7.5
technology 7.4
businesswoman 7.3
group 7.2
job 7.1
decor 7.1
steel 7.1
medical 7.1
clothing 7.1
child 7
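
Imagga's weighted tags come from its /v2/tags endpoint. A minimal sketch follows, assuming basic-auth API credentials and a publicly reachable image URL (both placeholders; the endpoint also accepts uploads for local files).

    # Hedged sketch of Imagga's tagging endpoint; credentials and the
    # image URL are placeholders.
    import requests

    AUTH = ("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET")  # placeholder pair
    IMAGE_URL = "https://example.org/photo.jpg"       # placeholder

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=AUTH,
        timeout=30,
    )
    resp.raise_for_status()

    # Each entry pairs a confidence weight with a localized tag name.
    for item in resp.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}")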

Google
created on 2022-02-26 (no labels listed)

Microsoft
created on 2022-02-26

text 97.5
person 95.4
indoor 93
black and white 91.1
drawing 75.5
clothing 55
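
Microsoft's tags correspond to the Azure Computer Vision service. A hedged sketch using the Python SDK's tag_image_in_stream call follows; the endpoint, key, and file name are placeholders, and the record does not say which API version produced its tags.

    # Hedged sketch of Azure Computer Vision tagging; endpoint, key, and
    # file name are placeholders.
    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://YOUR-RESOURCE.cognitiveservices.azure.com/",  # placeholder
        CognitiveServicesCredentials("YOUR_AZURE_KEY"),        # placeholder
    )

    with open("photo.jpg", "rb") as f:
        result = client.tag_image_in_stream(f)

    # Confidences come back 0-1; scale to match the percentages above.
    for tag in result.tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}")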

Face analysis

AWS Rekognition

Age 39-47
Gender Female, 86.8%
Happy 37.9%
Sad 25.2%
Calm 18.2%
Surprised 8.5%
Confused 4.3%
Angry 2.3%
Disgusted 1.9%
Fear 1.7%

AWS Rekognition

Age 21-29
Gender Male, 66.2%
Calm 99.1%
Sad 0.7%
Confused 0%
Happy 0%
Disgusted 0%
Angry 0%
Surprised 0%
Fear 0%
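
Both face blocks above have the shape of Rekognition DetectFaces output: an age range, a gender estimate with its confidence, and eight emotion scores that sum to roughly 100%. A sketch of the call follows; the file name is an assumption, and Attributes=["ALL"] is what requests the age, gender, and emotion estimates.

    # Hedged sketch of Rekognition face analysis; photo.jpg and the region
    # are placeholders.
    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # needed for age, gender, and emotions
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")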

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely
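
Google Vision reports face attributes as likelihood buckets (Very unlikely through Very likely) rather than percentages, which matches the rows above. A hedged sketch using the google-cloud-vision client follows; application credentials and the file name are assumed.

    # Hedged sketch of Google Cloud Vision face detection; photo.jpg is a
    # placeholder and credentials are taken from the environment.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        # Each attribute is a Likelihood enum, e.g. VERY_UNLIKELY.
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)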

Feature analysis

Amazon

Person
Person 99.1%
Person 93.6%
Person 43.5%
Person 43.1%

Categories

Imagga

paintings art 84.6%
interior objects 13.8%

Text analysis

Amazon

KODAK-A-ITW
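
The string above reads like a film-edge marking transcribed by Rekognition DetectText. A sketch of the call follows, with the file name assumed; LINE entries group the individual WORD detections.

    # Hedged sketch of Rekognition text detection; photo.jpg and the region
    # are placeholders.
    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})

    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":  # skip the per-word duplicates
            print(detection["DetectedText"], f"{detection['Confidence']:.1f}%")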

Google

MJI7-- YT37A°2-- XAGO
MJI7--
YT37A°2--
XAGO
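
Google Vision's text output conventionally begins with one annotation holding the full detected string, followed by one annotation per token, which matches the four lines above. A hedged sketch, with credentials and file name assumed:

    # Hedged sketch of Google Cloud Vision text detection; photo.jpg is a
    # placeholder and credentials are taken from the environment.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)

    # The first annotation is the full text; the rest are individual tokens.
    for i, annotation in enumerate(response.text_annotations):
        kind = "full text" if i == 0 else "token"
        print(f"{kind}: {annotation.description!r}")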