Human Generated Data

Title

Untitled (long line of people inside cafeteria, waiting in line for food)

Date

1946

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14562

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Building 99.8
Assembly Line 99.6
Factory 99.6
Person 99.5
Human 99.5
Person 97.4
Person 97.2
Hat 96.3
Clothing 96.3
Apparel 96.3
Person 95.2
Person 88.4
Person 81.8
Person 80.4
Person 70.6
Person 66
Person 63.4
Cafeteria 59.1
Restaurant 59.1
Manufacturing 58.6
Person 58.3
Person 50.8
Person 43.6
Person 42.7

Clarifai
created on 2023-10-29

people 99.8
group 97.4
many 97.3
adult 96.8
monochrome 94.5
man 94.1
group together 92.9
commerce 92.9
furniture 92.1
woman 90.8
indoors 90
room 89.4
employee 89.1
war 88.9
military 86.3
sit 85.1
administration 83.7
leader 83.6
wear 82.7
education 78.7

Imagga
created on 2022-01-29

newspaper 52.4
product 39.6
creation 30.7
man 30.2
people 27.3
person 25.6
smiling 25.3
sitting 24.9
laptop 23.8
adult 23.2
lifestyle 20.9
computer 20.9
male 19.8
home 18.3
happy 18.2
indoors 17.6
office 17
business 17
room 16.3
cheerful 16.2
casual 16.1
working 15.9
happiness 15.7
work 15
couple 14.8
table 13.8
indoor 13.7
education 13
outdoors 12.8
teacher 12.8
women 12.6
communication 12.6
professional 12.4
businessman 12.4
blackboard 12.2
classroom 12.2
enjoyment 12.2
day 11.8
two people 11.7
portrait 11.6
worker 11.6
desk 11.3
men 11.2
technology 11.1
relaxation 10.9
house 10.9
leisure 10.8
smile 10.7
color 10.6
looking 10.4
togetherness 10.4
resort 10.3
attractive 9.8
job 9.7
group 9.7
together 9.6
talking 9.5
back 9.2
holding 9.1
chair 9
20-24 years 8.8
leisure activity 8.8
mid adult 8.7
30s 8.7
child 8.5
two 8.5
side 8.4
senior 8.4
pretty 8.4
old 8.4
school 8.3
fun 8.2
alone 8.2
one 8.2
relaxing 8.2
handsome 8
love 7.9
typing 7.8
content 7.7
corporate 7.7
modern 7.7
using 7.7
finance 7.6
notebook 7.6
keyboard 7.5
relaxed 7.5
mature 7.4
camera 7.4
musical instrument 7.4
educator 7.3
success 7.2
team 7.2
travel 7

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

text 95.3
person 86.8
clothing 83
woman 73.7
clothes 15.7
shop 10.8

Color Analysis

Face analysis

AWS Rekognition

Age 22-30
Gender Male, 92.6%
Happy 61.8%
Sad 31.5%
Calm 4.7%
Fear 0.7%
Confused 0.4%
Angry 0.4%
Disgusted 0.3%
Surprised 0.2%

AWS Rekognition

Age 18-24
Gender Female, 95.4%
Calm 90.2%
Happy 7.1%
Sad 1%
Angry 0.6%
Confused 0.5%
Fear 0.2%
Surprised 0.2%
Disgusted 0.2%

AWS Rekognition

Age 21-29
Gender Female, 86.5%
Calm 82.2%
Sad 15.5%
Happy 1.3%
Confused 0.6%
Angry 0.2%
Disgusted 0.1%
Surprised 0.1%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Hat
Person 99.5%
Person 97.4%
Person 97.2%
Person 95.2%
Person 88.4%
Person 81.8%
Person 80.4%
Person 70.6%
Person 66%
Person 63.4%
Person 58.3%
Person 50.8%
Person 43.6%
Person 42.7%
Hat 96.3%

Text analysis

Amazon

WITH