Human Generated Data

Title

Untitled (group portrait of participants in first aid class posing in slings or bandages)

Date

c. 1930-1945

People

Artist: O. B. Porter Studio, American, active 1930s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10949

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 98.8
Human 98.8
Person 98.4
Person 97.9
Person 97.4
Person 97.1
Person 96.6
Apparel 96.6
Clothing 96.6
Person 96.3
Furniture 94.6
Chair 94.6
Person 86
Shorts 73.4
Person 70.9
Nature 70
Person 67.1
Shoe 63.2
Footwear 63.2
Photo 60.7
Portrait 60.7
Photography 60.7
Face 60.7
Shoe 59.1
Sleeve 57.8
Finger 56.7
Pants 56.2
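
The Amazon tags above pair each label with a confidence score. A minimal sketch of how such output is typically consumed downstream (the threshold and helper name are illustrative assumptions, not part of the Rekognition API; the label data is transcribed from the list above):

```python
# Hypothetical sketch: filtering Rekognition-style (label, confidence) pairs
# by a confidence threshold, as a tagging pipeline might before display.
# Labels and scores are transcribed from the Amazon tag list above.

LABELS = [
    ("Person", 98.8), ("Human", 98.8), ("Apparel", 96.6),
    ("Clothing", 96.6), ("Furniture", 94.6), ("Chair", 94.6),
    ("Shorts", 73.4), ("Nature", 70.0), ("Shoe", 63.2),
    ("Footwear", 63.2), ("Photo", 60.7), ("Portrait", 60.7),
]

def confident_labels(labels, threshold=90.0):
    """Keep labels at or above the confidence threshold, deduplicated."""
    seen = set()
    out = []
    for name, score in labels:
        if score >= threshold and name not in seen:
            seen.add(name)
            out.append(name)
    return out

print(confident_labels(LABELS))
# -> ['Person', 'Human', 'Apparel', 'Clothing', 'Furniture', 'Chair']
```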

Imagga
created on 2022-02-05

man 36.9
male 36.2
person 35.9
brass 31.9
people 31.2
wind instrument 29.9
professional 26.4
room 26.2
teacher 26.2
businessman 25.6
adult 25.1
business 23.7
men 23.2
musical instrument 22.3
happy 21.9
couple 20.9
office 20.9
women 20.6
group 19.3
worker 19.1
corporate 17.2
meeting 17
work 16.5
table 16.4
team 16.1
desk 16.1
smiling 15.9
executive 14.7
modern 14.7
businesswoman 14.5
home 14.4
smile 14.2
indoors 14.1
together 14
sitting 13.7
cornet 13.7
educator 13.6
talking 13.3
job 13.3
manager 13
nurse 12.8
portrait 12.3
sax 12.3
lifestyle 12.3
teamwork 12
communication 11.8
happiness 11.7
conference 11.7
interior 11.5
dancer 11.2
mature 11.2
suit 10.8
boy 10.4
black 10.2
two 10.2
indoor 10
board 9.9
handsome 9.8
new 9.7
diversity 9.6
education 9.5
businesspeople 9.5
classroom 9.4
patient 9.4
performer 9.4
successful 9.1
hand 9.1
barbershop 8.9
working 8.8
computer 8.8
shop 8.7
laughing 8.5
finance 8.4
senior 8.4
study 8.4
presentation 8.4
chair 8.3
cheerful 8.1
dress 8.1
success 8
family 8
medical 7.9
hospital 7.9
employee 7.9
diverse 7.8
colleagues 7.8
student 7.7
attractive 7.7
boss 7.6
casual 7.6
ethnic 7.6
device 7.5
fun 7.5
trombone 7.5
care 7.4
phone 7.4
lady 7.3
girls 7.3
laptop 7.3
confident 7.3
hall 7.1
entertainer 7

Google
created on 2022-02-05

Art 83.4
Hat 82.5
Chair 75.9
Monochrome 73.6
Vintage clothing 72.1
Font 71
Event 70
Painting 67.9
Crew 67.9
Room 66.8
Visual arts 66.5
Team 66.4
Monochrome photography 66.2
Illustration 65.9
Stock photography 63
Sitting 55.8
Uniform 55.1
Photo caption 53
Drawing 52.9
History 52.8

Microsoft
created on 2022-02-05

person 98
man 93.4
text 90.5
outdoor 86.7
clothing 84.1
old 56.3
furniture 52.5
posing 50.9

Face analysis

Amazon

Google

AWS Rekognition

Age 43-51
Gender Male, 99.8%
Calm 38.8%
Happy 22.7%
Surprised 13.1%
Sad 10.6%
Angry 6.9%
Confused 5.1%
Disgusted 1.8%
Fear 1.2%

AWS Rekognition

Age 48-54
Gender Male, 97%
Calm 66.8%
Surprised 20.9%
Sad 6.5%
Fear 2%
Happy 1.3%
Angry 1.1%
Confused 0.7%
Disgusted 0.7%

AWS Rekognition

Age 43-51
Gender Male, 99.9%
Happy 55.4%
Surprised 19.2%
Sad 10.6%
Angry 4.4%
Disgusted 3.7%
Calm 2.7%
Fear 2.5%
Confused 1.5%

AWS Rekognition

Age 38-46
Gender Male, 99.8%
Calm 61.9%
Happy 28.6%
Surprised 6.8%
Sad 1.1%
Disgusted 0.6%
Angry 0.5%
Confused 0.3%
Fear 0.2%

AWS Rekognition

Age 24-34
Gender Female, 93%
Calm 98.8%
Sad 0.9%
Fear 0.1%
Happy 0.1%
Confused 0%
Angry 0%
Disgusted 0%
Surprised 0%

AWS Rekognition

Age 19-27
Gender Male, 99.9%
Calm 57.7%
Happy 14.5%
Angry 6.4%
Sad 5.9%
Surprised 5%
Disgusted 4.3%
Confused 4.2%
Fear 1.9%

AWS Rekognition

Age 56-64
Gender Male, 99.9%
Calm 58.8%
Angry 11.5%
Happy 10.9%
Surprised 6.9%
Sad 4.8%
Disgusted 3.8%
Confused 2.4%
Fear 0.9%

AWS Rekognition

Age 37-45
Gender Male, 65.6%
Calm 36.1%
Surprised 24.6%
Happy 24.3%
Fear 4.4%
Angry 4%
Confused 2.5%
Sad 2.1%
Disgusted 1.9%

AWS Rekognition

Age 37-45
Gender Female, 96.3%
Calm 97%
Surprised 1.5%
Sad 0.6%
Angry 0.4%
Disgusted 0.1%
Confused 0.1%
Happy 0.1%
Fear 0.1%
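
Each AWS Rekognition face result above distributes confidence across eight emotions. A minimal sketch of reducing one such distribution to a single dominant emotion (the margin rule and function name are illustrative assumptions; the scores are transcribed from the first face result above):

```python
# Hypothetical sketch: picking a dominant emotion from Rekognition-style
# per-face emotion scores, with a margin check so near-ties are not
# over-interpreted. Scores are from the first AWS Rekognition face above.

FACE_EMOTIONS = {
    "Calm": 38.8, "Happy": 22.7, "Surprised": 13.1, "Sad": 10.6,
    "Angry": 6.9, "Confused": 5.1, "Disgusted": 1.8, "Fear": 1.2,
}

def dominant_emotion(emotions, min_margin=10.0):
    """Return the top emotion, or None when the top two are too close to call."""
    ranked = sorted(emotions.items(), key=lambda kv: kv[1], reverse=True)
    top, runner_up = ranked[0], ranked[1]
    if top[1] - runner_up[1] < min_margin:
        return None  # ambiguous distribution
    return top[0]

print(dominant_emotion(FACE_EMOTIONS))
# -> Calm  (38.8 leads Happy's 22.7 by more than the 10-point margin)
```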

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.8%
Chair 94.6%
Shoe 63.2%

Captions

Microsoft

a group of people posing for a photo 87.7%
a group of people posing for a picture 87.6%
a group of people posing for the camera 87.5%

Text analysis

Amazon

-
Aid
Instruction
The Aid Instruction Chart
- for Cross
Chart
for
The
Cross
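
The text analysis list mixes whole detected lines ("The Aid Instruction Chart") with their individual words ("Aid", "Chart"), which is consistent with AWS Rekognition returning both LINE- and WORD-level detections. A minimal sketch of collapsing such output to the line-level strings (the substring heuristic and function name are illustrative assumptions, since the per-detection type field is not shown above):

```python
# Hypothetical sketch: deduplicating Rekognition-style text detections by
# dropping any detection that is a substring of a longer one, leaving only
# the line-level strings. Tokens are transcribed from the list above.

DETECTIONS = ["-", "Aid", "Instruction", "The Aid Instruction Chart",
              "- for Cross", "Chart", "for", "The", "Cross"]

def lines_only(detections):
    """Drop any detection contained in a longer detection."""
    return [d for d in detections
            if not any(d != other and d in other for other in detections)]

print(lines_only(DETECTIONS))
# -> ['The Aid Instruction Chart', '- for Cross']
```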