Human Generated Data

Title

Untitled (group eating at indoor picnic table)

Date

c. 1950

People

Artist: Harry Annas, American, 1897-1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2528

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Human 99.8
Person 99.8
Person 99.6
Person 99.5
Person 98.7
Person 98.5
Person 95.6
Furniture 93.4
Person 93.1
Person 90.5
Indoors 86
Room 86
People 74.8
Clothing 74.6
Apparel 74.6
Building 72.9
Table 70.8
Chair 68.9
Coat 66
Overcoat 66
Suit 66
Housing 64.3
Living Room 59.8
Person 59.6
Classroom 57.9
School 57.9
Crowd 57.5
Child 55.2
Kid 55.2

Imagga
created on 2022-03-05

man 30.2
people 28.4
male 27.7
classroom 25.5
group 24.2
business 22.5
room 22.1
building 21.7
men 20.6
women 20.5
person 20.3
school 19.4
adult 19.2
office 19
happy 18.8
chair 18.1
meeting 16.9
work 16.5
team 16.1
teacher 15.6
teamwork 14.8
education 14.7
sitting 14.6
table 13.8
lifestyle 13.7
laptop 13.7
hall 13.5
businessman 13.2
indoors 13.2
corporate 12.9
worker 12.7
conference 12.7
executive 12.6
job 12.4
interior 12.4
friends 12.2
casual 11.9
student 11.6
smiling 11.6
together 11.4
desk 11.3
boy 11.3
businesswoman 10.9
smile 10.7
modern 10.5
success 10.4
two 10.2
communication 10.1
structure 10
portrait 9.7
class 9.6
computer 9.6
businesspeople 9.5
to 8.8
couple 8.7
happiness 8.6
drinking 8.6
board 8.6
glass 8.5
attractive 8.4
manager 8.4
house 8.3
inside 8.3
indoor 8.2
cheerful 8.1
suit 8.1
life 8.1
home 8
window 7.9
working 7.9
20-24 years 7.9
discussion 7.8
boss 7.6
talking 7.6
friendship 7.5
city 7.5
study 7.5
holding 7.4
phone 7.4
blackboard 7.3
children 7.3
architecture 7.2
professional 7.2
employee 7.2
hospital 7.1
love 7.1
kid 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

person 95.6
clothing 90
black and white 82.4
house 58.8
man 51.1

Face analysis

AWS Rekognition

Age 42-50
Gender Male, 99.9%
Calm 78.8%
Angry 8.1%
Happy 7.8%
Sad 1.7%
Surprised 1.4%
Confused 1.1%
Disgusted 0.7%
Fear 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%

Captions

Microsoft

a group of people sitting in front of a window 60.2%
a group of people standing in front of a window 60.1%
a group of people in front of a window 60%

Text analysis

Amazon

VE32A2