Human Generated Data

Title

Untitled (children in classroom looking at globe)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16835

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.5
Human 99.5
Person 97.8
Room 95.8
Indoors 95.8
Interior Design 95.6
Person 93
Restaurant 92.9
Classroom 91.8
School 91.8
Person 84.7
Person 83.5
Person 83.3
Person 82.1
Furniture 80.3
Chair 77.6
Person 74.7
Cafeteria 73.1
Meal 72.2
Food 72.2
Chair 63.8
Person 62.2
Person 61.4
Couch 59.6
Cafe 58.3
Dining Room 58
Living Room 55.7

Imagga
created on 2022-02-26

room 100
classroom 100
people 27.3
interior 22.1
person 21.4
indoors 21.1
man 20.8
home 20.7
business 18.2
office 17.7
male 17
shop 16.6
men 16.3
table 16.1
barbershop 15.9
house 15
modern 14.7
businessman 14.1
kitchen 13.4
adult 12.7
work 12.3
chair 12.3
meeting 12.2
mercantile establishment 11.5
indoor 10.9
working 10.6
furniture 10.5
floor 10.2
worker 9.8
computer 9.6
education 9.5
smiling 9.4
cheerful 8.9
decor 8.8
medical 8.8
blackboard 8.8
conference 8.8
happy 8.8
women 8.7
lifestyle 8.7
teacher 8.4
clinic 8.4
teamwork 8.3
desk 8.3
window 8.2
laptop 8.2
team 8.1
group 8.1
job 8
sitting 7.7
wall 7.7
exam 7.7
place of business 7.6
study 7.5
inside 7.4
design 7.3
hospital 7.3
businesswoman 7.3
to 7.1
architecture 7
together 7

Google
created on 2022-02-26

Chair 89.2
Window 86.9
Black-and-white 84.5
Style 83.8
Table 79.3
Plant 76.2
Monochrome photography 75
Building 74.1
Monochrome 74.1
Room 70.1
Art 69.9
Houseplant 67
Picture frame 65.3
Event 64.9
Sitting 64.6
History 62.2
Font 60.3
Class 58.4
Illustration 55
Child 51.5

Microsoft
created on 2022-02-26

table 91.4
text 91.2
furniture 90.6
chair 75.7
computer 64
person 55.7
desk 53.4
cluttered 12.6

Face analysis

Amazon

AWS Rekognition

Age 42-50
Gender Female, 74.2%
Calm 99.7%
Sad 0.1%
Happy 0.1%
Angry 0.1%
Confused 0%
Disgusted 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 45-51
Gender Female, 92.5%
Calm 99.5%
Surprised 0.3%
Sad 0.1%
Disgusted 0%
Fear 0%
Confused 0%
Happy 0%
Angry 0%

AWS Rekognition

Age 12-20
Gender Female, 98.5%
Calm 98.9%
Sad 0.5%
Happy 0.3%
Confused 0.1%
Angry 0.1%
Surprised 0.1%
Disgusted 0%
Fear 0%

Feature analysis

Amazon

Person 99.5%
Chair 77.6%

Captions

Microsoft

a group of people playing instruments and performing on a counter 51.8%
a group of people performing on a counter 51.7%
a group of people sitting at a table 51.6%

Text analysis

Amazon

40
the
who
Frank
Heeded the
Heeded
Boys who
Boys
Bulletin
John
Bulletin BASE
B
BASE
I
NERY

Google

Soys who
Soys
who