Human Generated Data

Title

Untitled (scientists projecting image on screen)

Date

1954, printed later

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.143

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2021-12-14

Person 99.5
Human 99.5
Person 99.1
Person 95.2
Indoors 92.3
Room 89.9
White Board 82.8
Classroom 77.8
School 77.8
Crowd 57
Audience 57
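The label/confidence pairs above can be post-processed in the usual way, for instance by keeping only tags at or above a confidence threshold. A minimal sketch (this is not any vendor's API; the labels and scores are copied verbatim from the Amazon list in this record):

```python
# Amazon labels and confidence percentages, copied from this record.
amazon_tags = [
    ("Person", 99.5), ("Human", 99.5), ("Person", 99.1), ("Person", 95.2),
    ("Indoors", 92.3), ("Room", 89.9), ("White Board", 82.8),
    ("Classroom", 77.8), ("School", 77.8), ("Crowd", 57.0), ("Audience", 57.0),
]

def confident_tags(tags, threshold=80.0):
    """Return (label, confidence) pairs meeting the threshold, highest first."""
    kept = [(label, score) for label, score in tags if score >= threshold]
    return sorted(kept, key=lambda pair: pair[1], reverse=True)

print(confident_tags(amazon_tags, threshold=90.0))
```

At a 90% threshold this keeps only the five highest-confidence labels (Person, Human, Person, Person, Indoors).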

Clarifai
created on 2023-10-25

desk 99.7
people 99
adult 97.7
woman 97.7
chair 97.3
man 96.1
table 95.8
group 95.5
room 95.4
furniture 95
monochrome 94.5
indoors 92.7
education 89.3
sit 86.5
classroom 85.9
office 85.9
league 85.7
group together 84.3
child 83.4
three 82.6

Imagga
created on 2021-12-14

office 51.2
classroom 42.6
room 42.4
computer 39.8
man 39
businessman 38
business 37.1
professional 36.6
male 36.2
person 34.5
desk 34.3
laptop 33
people 32.9
teacher 32.4
meeting 31.1
work 30.6
corporate 30.1
adult 28.1
table 27.7
businesswoman 27.3
group 26.6
sitting 24.9
working 24.8
indoors 22.9
team 22.4
men 22.3
executive 21.9
businesspeople 21.8
job 21.2
communication 21
smiling 20.3
happy 20.1
restaurant 19.7
teamwork 19.5
education 19.1
women 19
worker 17.9
workplace 17.2
building 16.8
manager 15.8
smile 15.7
colleagues 15.5
indoor 15.5
presentation 14.9
modern 14.7
educator 14.7
looking 14.4
talking 14.3
student 14.1
technology 14.1
together 14
chair 13.5
notebook 13.4
occupation 12.8
conference 12.7
interior 12.4
mature 12.1
board 11.8
suit 11.7
career 11.4
adults 11.4
learning 11.3
success 11.3
home 11.2
document 11.1
color 11.1
lifestyle 10.8
director 10.7
corporation 10.6
cheerful 10.6
school 10.3
study 10.3
happiness 10.2
structure 10.1
associates 9.8
computer network 9.8
discussion 9.7
partners 9.7
portrait 9.7
network 9.7
class 9.6
paper 9.4
two 9.3
successful 9.2
confident 9.1
seminar 8.9
students 8.8
consultant 8.8
couple 8.7
busy 8.7
exam 8.6
hairdresser 8.5
expression 8.5
pen 8.5
contemporary 8.5
keyboard 8.5
focus 8.3
phone 8.3
handsome 8
salon 7.9
hall 7.9
employee 7.8
30s 7.7
diversity 7.7
entrepreneur 7.6
gesture 7.6
reading 7.6
college 7.6
females 7.6
horizontal 7.5
showing 7.5
clothes 7.5
coffee 7.4
engineer 7.4
monitor 7.1
to 7.1
specialist 7

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

wall 98.1
indoor 98.1
text 97.6
person 97
table 91.3
laptop 82.7
desk 80.8
furniture 78.5
clothing 71
black and white 67.7
office 54.7
computer 50.3

Color Analysis

Face analysis

AWS Rekognition

Age 27-43
Gender Male, 91.1%
Calm 88.2%
Surprised 5.8%
Happy 2.6%
Sad 1.6%
Angry 1.2%
Confused 0.2%
Fear 0.2%
Disgusted 0.2%

AWS Rekognition

Age 34-50
Gender Male, 93.8%
Calm 99.6%
Sad 0.2%
Surprised 0.1%
Fear 0%
Happy 0%
Angry 0%
Confused 0%
Disgusted 0%

AWS Rekognition

Age 27-43
Gender Male, 81.9%
Calm 69.6%
Sad 26.5%
Angry 0.9%
Fear 0.7%
Surprised 0.7%
Confused 0.7%
Happy 0.6%
Disgusted 0.2%
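Each emotion distribution above sums to roughly 100%, and the face's dominant emotion is simply the highest-scoring entry. A small sketch using the third face's percentages from this record (not Rekognition's API, just the listed values):

```python
# Emotion percentages for the third detected face, copied from this record.
face_3 = {
    "Calm": 69.6, "Sad": 26.5, "Angry": 0.9, "Fear": 0.7,
    "Surprised": 0.7, "Confused": 0.7, "Happy": 0.6, "Disgusted": 0.2,
}

def dominant_emotion(scores):
    """Return the (emotion, confidence) pair with the highest confidence."""
    return max(scores.items(), key=lambda item: item[1])

print(dominant_emotion(face_3))  # ('Calm', 69.6)
```

Note that for this face the margin is much narrower than for the other two (Calm 69.6% vs. Sad 26.5%), so the dominant label is less certain.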

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%

Categories

Imagga

interior objects 88.2%
food drinks 9.8%
paintings art 1.2%