Human Generated Data

Title

Untitled (people in theater audience watching show; three boys in front)

Date

1954

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14317

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Interior Design 100
Indoors 100
Furniture 99.6
Human 99
Person 99
Person 98
Person 98
Person 97.9
Person 97.8
Person 95.9
Room 94.8
Person 93
Person 85.9
Person 84.2
Crowd 78.4
Sitting 74.6
Theater 68.6
People 67.6
Audience 67.5
Female 65
Accessory 63
Accessories 63
Glasses 63
Clothing 58.2
Apparel 58.2
Face 57.4
Suit 57.3
Coat 57.3
Overcoat 57.3
Cinema 55.4
Chair 55.3
Chair 51.1
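The Amazon list above has the shape of output from the AWS Rekognition DetectLabels API: a label name followed by a 0–100 confidence score. A minimal sketch of rendering such a response into these lines — the `response` dict mimics the documented DetectLabels response shape, and its values here are illustrative samples, not the actual API call for this image:

```python
# Format Rekognition-style DetectLabels output into "Name confidence"
# lines, sorted by descending confidence (as in the list above).
# `response` is an illustrative stand-in for the real API response.

def format_labels(response):
    """Return 'Name confidence' lines sorted by descending confidence."""
    labels = sorted(response["Labels"], key=lambda l: -l["Confidence"])
    return [f"{l['Name']} {round(l['Confidence'], 1)}" for l in labels]

response = {
    "Labels": [
        {"Name": "Interior Design", "Confidence": 100.0},
        {"Name": "Theater", "Confidence": 68.6},
        {"Name": "Chair", "Confidence": 55.3},
    ]
}

for line in format_labels(response):
    print(line)
```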

Imagga
created on 2022-01-29

seat 65.7
support 49.5
car 46.6
device 36.2
plane seat 33.5
sitting 27.5
vehicle 25.6
automobile 23
driver 22.3
man 21.5
adult 21.4
transportation 20.6
modern 19.6
person 19.4
driving 19.3
auto 19.1
people 19
laptop 18.1
business 17.6
drive 17
interior 16.8
attractive 16.8
rest 16.7
chair 16.5
happy 15.7
wheel 15.3
male 14.9
inside 14.7
transport 14.6
computer 14
pretty 14
smiling 13
portrait 12.9
home 12.7
happiness 12.5
office 12.4
new 12.1
smile 12.1
work 11.8
couch 11.6
indoors 11.4
room 11.1
luxury 11.1
upholstery 11.1
window 11
lifestyle 10.8
headrest 10.8
cheerful 10.6
travel 10.6
professional 10.2
house 10
technology 9.6
looking 9.6
couple 9.6
clothing 9.5
model 9.3
casual 9.3
businesswoman 9.1
human 9
armrest 9
one 9
lady 8.9
sexy 8.8
rumble 8.7
sit 8.5
furniture 8.4
fashion 8.3
road 8.1
working 7.9
businessman 7.9
cute 7.9
motor vehicle 7.8
face 7.8
motor 7.7
elegant 7.7
sofa 7.6
living 7.6
horizontal 7.5
senior 7.5
black 7.3
suit 7.2
hair 7.1
women 7.1
job 7.1
equipment 7

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

person 98.6
text 86.1
chair 82.7
black and white 59.4
clothing 56.9
furniture 53
crowd 0.6

Face analysis

AWS Rekognition

Age 23-33
Gender Female, 73.4%
Happy 42.7%
Calm 38.7%
Fear 8%
Sad 7.5%
Disgusted 1.1%
Confused 0.8%
Angry 0.7%
Surprised 0.6%

AWS Rekognition

Age 27-37
Gender Male, 98.2%
Calm 99.4%
Surprised 0.3%
Sad 0.2%
Disgusted 0%
Fear 0%
Confused 0%
Happy 0%
Angry 0%

AWS Rekognition

Age 31-41
Gender Female, 72.4%
Calm 79.2%
Sad 6.2%
Happy 5.2%
Fear 3.8%
Confused 1.7%
Disgusted 1.5%
Angry 1.3%
Surprised 1.1%
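Each AWS Rekognition block above corresponds to one detected face: an age range, a gender with confidence, and emotion scores sorted high to low. A sketch of producing those lines from a DetectFaces-style `FaceDetail` — the `face` dict follows the documented FaceDetail shape, with illustrative values taken from the second face above:

```python
# Render one Rekognition-style FaceDetail into the "Age / Gender /
# Emotion %" lines shown above. `face` is an illustrative stand-in
# for one entry of a real DetectFaces response.

def format_face(face):
    lines = [f"Age {face['AgeRange']['Low']}-{face['AgeRange']['High']}"]
    g = face["Gender"]
    lines.append(f"Gender {g['Value']}, {round(g['Confidence'], 1)}%")
    # Emotions are reported in uppercase (e.g. "CALM"); sort descending.
    for emo in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        lines.append(f"{emo['Type'].capitalize()} {round(emo['Confidence'], 1)}%")
    return lines

face = {
    "AgeRange": {"Low": 27, "High": 37},
    "Gender": {"Value": "Male", "Confidence": 98.2},
    "Emotions": [
        {"Type": "CALM", "Confidence": 99.4},
        {"Type": "SURPRISED", "Confidence": 0.3},
    ],
}

print("\n".join(format_face(face)))
```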

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Likely

Feature analysis

Amazon

Person 99%
Chair 55.3%

Captions

Microsoft

a group of people sitting in chairs 95.2%
a group of people sitting in a chair 94%
a group of people sitting on a bus 61.3%
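The caption list above matches the shape of an Azure Computer Vision image-description response, where each candidate caption carries a 0–1 confidence that is shown here as a percentage. A sketch of that conversion — the `result` dict mirrors the documented response layout, and the values are illustrative:

```python
# Format an Azure Computer Vision describe-image-style response into
# "caption confidence%" lines. `result` is an illustrative stand-in,
# not the actual API response for this photograph.

def format_captions(result):
    return [
        f"{c['text']} {round(c['confidence'] * 100, 1)}%"
        for c in result["description"]["captions"]
    ]

result = {
    "description": {
        "captions": [
            {"text": "a group of people sitting in chairs", "confidence": 0.952},
            {"text": "a group of people sitting on a bus", "confidence": 0.613},
        ]
    }
}

for line in format_captions(result):
    print(line)
```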

Text analysis

Amazon

So