Human Generated Data

Title

Untitled (men at long counter in lecture room, eating)

Date

1958

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20105

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.7
Human 99.7
Person 98.7
Person 96.9
Person 95.1
Interior Design 91.8
Indoors 91.8
Person 86.4
Restaurant 85.3
Room 81.2
Person 79.3
Person 77.8
Cafeteria 77.2
Person 76.9
Sitting 72.7
Person 68.8
Screen 68.6
Electronics 68.6
Person 66.2
Monitor 65.4
Display 65.4
Classroom 63.5
School 63.5
LCD Screen 61.4
Finger 58.9
Cafe 55.1
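
The label/confidence pairs above match the shape of output from Amazon Rekognition's DetectLabels API. As a rough illustration only (assuming boto3 with configured AWS credentials; the image file name is hypothetical), tags like these could be produced as follows:

# Sketch: generate label/confidence tags for a scanned photograph with
# Amazon Rekognition DetectLabels. AWS credentials for boto3 are assumed;
# "photograph.jpg" is a hypothetical local file name.
import boto3

rekognition = boto3.client("rekognition")

with open("photograph.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=30,        # cap the number of returned labels
    MinConfidence=55.0,  # drop labels the model is unsure about
)

for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))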

Clarifai
created on 2023-10-22

people 99.5
man 97.8
group 97.4
monochrome 97.4
indoors 95.8
adult 95.7
woman 94.3
room 94.1
group together 92.6
sit 91.9
sitting 90.9
desk 88.7
education 88.7
furniture 87.7
chair 86.7
administration 86.4
actor 86.4
many 85.7
classroom 85.4
leader 84.3

Imagga
created on 2022-03-05

hairdresser 100
salon 68.2
man 32.9
people 31.8
shop 27.8
indoors 27.3
barbershop 25.6
male 24.1
person 22.6
interior 22.1
adult 21.4
work 21.2
room 21
home 20
happy 18.8
office 18.5
smiling 17.4
senior 15.9
chair 15.9
mercantile establishment 15.3
table 14.7
occupation 14.7
sitting 14.6
lifestyle 14.5
medical 14.1
patient 13.7
professional 13.6
business 13.4
working 13.3
doctor 13.2
women 12.7
hospital 12.6
happiness 12.5
job 12.4
meeting 12.3
men 12
computer 12
modern 11.9
worker 11.6
smile 11.4
couple 11.3
health 11.1
inside 11
two 11
indoor 11
kitchen 10.7
family 10.7
desk 10.4
portrait 10.4
mature 10.2
place of business 10.2
communication 10.1
holding 9.9
equipment 9.8
medicine 9.7
adults 9.5
technology 8.9
together 8.8
look 8.8
restaurant 8.7
profession 8.6
life 8.6
screen 8.4
treatment 8.3
care 8.2
kid 8
education 7.8
two people 7.8
retired 7.8
illness 7.6
college 7.6
furniture 7.5
child 7.5
clothes 7.5
service 7.4
cheerful 7.3
lady 7.3
laptop 7.3
group 7.3
board 7.2
color 7.2
looking 7.2
team 7.2

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 98.1
person 96.1
indoor 95
clothing 92.3
man 88.1
black and white 81.5
table 58.6

Face analysis

AWS Rekognition

Age 28-38
Gender Male, 92.4%
Calm 93%
Confused 2.7%
Surprised 1.6%
Sad 0.9%
Angry 0.6%
Disgusted 0.6%
Fear 0.4%
Happy 0.3%

AWS Rekognition

Age 21-29
Gender Male, 84%
Calm 96.5%
Sad 2.9%
Angry 0.2%
Surprised 0.1%
Disgusted 0.1%
Happy 0.1%
Fear 0%
Confused 0%

AWS Rekognition

Age 22-30
Gender Male, 58.2%
Calm 94.7%
Sad 3%
Happy 1%
Angry 0.4%
Disgusted 0.3%
Confused 0.3%
Surprised 0.2%
Fear 0.1%
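
The age-range, gender, and emotion estimates above follow the output format of Amazon Rekognition's DetectFaces API. A minimal sketch under the same assumptions as the earlier example (boto3 credentials configured, hypothetical file name):

# Sketch: per-face age, gender, and emotion estimates with Amazon
# Rekognition DetectFaces. Attributes=["ALL"] requests the full set.
import boto3

rekognition = boto3.client("rekognition")

with open("photograph.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

faces = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in faces["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")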

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely
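
The "Very unlikely" / "Possible" / "Likely" values above are the likelihood buckets reported by Google Cloud Vision face detection. A brief sketch, assuming the google-cloud-vision client library with application default credentials and, again, a hypothetical file name:

# Sketch: face-likelihood buckets (VERY_UNLIKELY .. VERY_LIKELY) from the
# Google Cloud Vision API for each detected face.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photograph.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)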

Feature analysis

Amazon

Person
Person 99.7%
Person 98.7%
Person 96.9%
Person 95.1%
Person 86.4%
Person 79.3%
Person 77.8%
Person 76.9%
Person 68.8%
Person 66.2%

Text analysis

Amazon

5
SAFETY
FILM
KODAK SAFETY FILM
KODAK

Google

KODAK SAFETY FILM } 5
KODAK
SAFETY
FILM
}
5
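
The detected strings above ("KODAK SAFETY FILM" and the number "5") appear to be edge markings on the film stock picked up by OCR. A short sketch of how such text could be read with Amazon Rekognition DetectText, under the same assumptions as the earlier examples:

# Sketch: OCR of film edge markings with Amazon Rekognition DetectText.
import boto3

rekognition = boto3.client("rekognition")

with open("photograph.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

result = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in result["TextDetections"]:
    if detection["Type"] == "LINE":  # WORD entries are also returned
        print(detection["DetectedText"], round(detection["Confidence"], 1))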