Human Generated Data

Title

Untitled (children sitting at table and eating, mothers helping)

Date

1958

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16463

Machine Generated Data

Tags

Amazon
created on 2022-02-11

Person 99.6
Human 99.6
Person 99.4
Person 98.6
Person 98
Person 97.7
Helmet 96.4
Apparel 96.4
Clothing 96.4
Furniture 93.2
Chair 93.2
Person 92.9
Person 91.6
Indoors 89.6
Room 89.6
Person 86.7
Restaurant 82.8
Sunglasses 81.8
Accessory 81.8
Accessories 81.8
Classroom 76.5
School 76.5
Person 75.3
Table 73.3
Dining Table 73.3
Cafeteria 73.2
People 68.3
Meal 63.3
Food 63.3
Dining Room 62
Girl 61
Female 61
Workshop 58.9
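The Amazon tag list above pairs each label with a confidence score, and a label such as Person repeats once per detection. A minimal sketch of collapsing such a list to one entry per label (an illustrative helper, not the Rekognition API itself; the sample values are copied from the list above):

```python
# Collapse a Rekognition-style label list to one entry per label,
# keeping the highest confidence seen for each.
# Sample (label, confidence) pairs are taken from the tag list above.
raw_tags = [
    ("Person", 99.6), ("Human", 99.6), ("Person", 99.4),
    ("Helmet", 96.4), ("Chair", 93.2), ("Person", 92.9),
    ("Sunglasses", 81.8), ("Person", 75.3),
]

def top_confidence(tags):
    best = {}
    for label, conf in tags:
        # Keep only the highest confidence per distinct label.
        if conf > best.get(label, 0.0):
            best[label] = conf
    # Sort descending by confidence for display.
    return sorted(best.items(), key=lambda kv: -kv[1])

for label, conf in top_confidence(raw_tags):
    print(f"{label} {conf}")
```

This is why the "Feature analysis" section further down lists Person only once, at its highest score.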

Imagga
created on 2022-02-11

person 31.6
man 30.2
brass 27.8
sax 25.7
people 25.6
wind instrument 25.3
male 24.8
work 23.5
adult 20.6
medical 19.4
student 17.3
musical instrument 16.6
room 16.5
science 16
patient 15.9
medicine 15.8
human 15.7
education 15.6
working 15
business 14.6
specialist 14.5
worker 14.2
businessman 14.1
health 13.9
classroom 13.8
portrait 13.6
technology 13.3
job 13.3
senior 13.1
men 12.9
chemistry 12.6
laboratory 12.5
case 12.1
professional 12.1
equipment 12.1
group 12.1
trombone 11.7
cornet 11.6
test 11.5
doctor 11.3
occupation 11
clinic 10.8
lab 10.7
research 10.5
biology 10.4
looking 10.4
teacher 10.3
black 10.2
happy 10
care 9.9
team 9.8
nurse 9.8
office 9.8
chair 9.8
chemical 9.7
sitting 9.4
instrument 9.4
teamwork 9.3
blackboard 9.1
coat 9
scientist 8.8
computer 8.8
teaching 8.8
smiling 8.7
plan 8.5
old 8.4
device 8.1
suit 8.1
hospital 8.1
uniform 8
iron lung 7.9
chemist 7.9
smile 7.8
engineer 7.8
space 7.7
scientific 7.7
stage 7.7
class 7.7
modern 7.7
construction 7.7
hand 7.6
sick person 7.5
music 7.5
holding 7.4
school 7.3

Google
created on 2022-02-11

Motor vehicle 86.6
Chair 76.9
Snapshot 74.3
Crew 72.9
T-shirt 71.5
Musician 70.5
Event 69.7
Monochrome 65.5
Monochrome photography 65.3
Art 62.5
Machine 61.6
Room 60.8
Team 60.4
Sitting 59.8
Font 55.4
History 54.3
Visual arts 54.3
Vintage clothing 53.9
Music 50

Microsoft
created on 2022-02-11

text 99.7
person 99.4
clothing 94.1
man 91.7
people 72.5
group 70.8
concert 56.3
musical instrument 51.1

Face analysis

AWS Rekognition

Age 33-41
Gender Male, 85.6%
Calm 95.7%
Surprised 1.4%
Happy 1.3%
Sad 0.6%
Disgusted 0.3%
Angry 0.3%
Confused 0.1%
Fear 0.1%

AWS Rekognition

Age 24-34
Gender Male, 97.7%
Surprised 57.8%
Angry 12.8%
Calm 8.6%
Fear 6.8%
Happy 6.8%
Sad 2.9%
Disgusted 2.6%
Confused 1.7%
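Each AWS Rekognition face block above reports an age range, a gender estimate, and a distribution over eight emotions. A small sketch (a hypothetical helper, not the actual Rekognition response format) of reducing such a distribution to its dominant emotion, using the percentages from the second face block:

```python
# Emotion percentages copied from the second AWS Rekognition face above.
emotions = {
    "Surprised": 57.8, "Angry": 12.8, "Calm": 8.6, "Fear": 6.8,
    "Happy": 6.8, "Sad": 2.9, "Disgusted": 2.6, "Confused": 1.7,
}

def dominant(emotion_scores):
    # Return the (emotion, percentage) pair with the highest score.
    return max(emotion_scores.items(), key=lambda kv: kv[1])

label, score = dominant(emotions)
print(f"{label} {score}%")
```

For this face the distribution is far less peaked than the first (Calm 95.7%), so the dominant label Surprised carries much lower certainty.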

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Helmet 96.4%
Chair 93.2%
Sunglasses 81.8%

Captions

Microsoft

a group of people around each other 91%
a group of people posing for a photo 89.2%
a group of people pose for a photo 89.1%

Text analysis

Amazon

28

Google

YT
28 MJIA- - YT RA°2- -NAGON
28
MJIA-
-
RA°2-
-NAGON