Human Generated Data

Title

Untitled (people eating at long tables at cowboy-themed party)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16852

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2022-02-26

School 99.7
Room 99.7
Classroom 99.7
Indoors 99.7
Human 99.5
Person 99.5
Person 99
Person 98.3
Person 96.6
Person 96.1
Person 94.2
Person 93.8
Person 88.6
Person 85.2
Person 83
Furniture 81.7
Interior Design 79.2
Person 78.6
Chair 67.6
Person 62.4
Person 61.7
Person 61.7
People 59.7
Table 58.4
Photography 57.7
Photo 57.7
Kindergarten 57.6
Workshop 55.1
Bedroom 55
Person 48.5
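
Tag lists like the one above can be post-processed programmatically, for example by filtering to high-confidence labels and counting repeated Person detections. A minimal sketch, with a few of the Amazon labels copied inline (the threshold and function name are illustrative, not part of any Rekognition SDK):

```python
from collections import Counter

# A subset of the Amazon labels above, as (label, confidence %) pairs.
labels = [
    ("School", 99.7), ("Room", 99.7), ("Classroom", 99.7), ("Indoors", 99.7),
    ("Human", 99.5), ("Person", 99.5), ("Person", 99.0), ("Person", 98.3),
    ("Furniture", 81.7), ("Chair", 67.6), ("Table", 58.4), ("Person", 48.5),
]

def filter_labels(labels, threshold=90.0):
    """Keep labels at or above the confidence threshold,
    counting duplicates (e.g. one entry per detected person)."""
    kept = [name for name, conf in labels if conf >= threshold]
    return Counter(kept)

counts = filter_labels(labels)
print(counts["Person"])  # → 3 high-confidence person detections
```

Duplicate "Person" entries are kept as a count rather than collapsed, since each one corresponds to a separate detection in the photograph.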

Imagga
created on 2022-02-26

room 37.3
man 34.3
people 31.2
businessman 29.1
table 28.6
male 28.4
business 27.9
wind instrument 26.9
group 26.6
meeting 26.4
oboe 26.3
brass 25.5
person 25
salon 24.9
adult 24.1
classroom 23.5
teacher 22.2
office 22.1
sitting 20.6
indoors 20.2
businesswoman 20
team 19.7
businesspeople 19
smiling 18.8
interior 18.6
musical instrument 18.5
professional 18.4
women 18.2
work 18
desk 17.9
chair 17.4
men 17.2
happy 16.9
communication 16.8
home 16.7
together 16.6
lifestyle 16.6
corporate 16.3
teamwork 15.8
executive 15.1
conference 14.7
indoor 14.6
restaurant 14.6
suit 14.4
talking 14.3
couple 13.9
cheerful 13.8
cornet 13.3
colleagues 12.6
job 12.4
smile 12.1
coffee 12
modern 11.9
happiness 11.7
life 11.5
woodwind 11.3
mature 11.2
worker 11.1
holding 10.7
discussion 10.7
family 10.7
workplace 10.5
computer 10.4
education 10.4
educator 10.3
study 10.3
student 10.2
presentation 10.2
laptop 10
success 9.7
class 9.6
casual 9.3
shop 9.3
drink 9.2
20s 9.2
successful 9.1
portrait 9.1
board 9
handsome 8.9
working 8.8
collar 8.6
glass 8.6
barbershop 8.5
learning 8.5
friends 8.4
senior 8.4
employee 8.4
manager 8.4
color 8.3
phone 8.3
confident 8.2
new 8.1
coworkers 7.9
40s 7.8
partner 7.7
30s 7.7
drinking 7.6
two 7.6
ethnic 7.6
hand 7.6
eating 7.6
hall 7.5
fun 7.5
alcohol 7.4
food 7.4
inside 7.4
occupation 7.3
girls 7.3
looking 7.2
romantic 7.1
to 7.1

Google
created on 2022-02-26

Photograph 94.3
Black-and-white 85
Style 83.9
Chair 80.7
Monochrome 74.4
Snapshot 74.3
Monochrome photography 74.2
Event 73.2
Suit 71.7
Room 71.4
Team 69.4
Class 68.5
Stock photography 66.8
Building 66.4
T-shirt 65.9
Crowd 65
Window 63.5
History 62.2
Sitting 59.9
Education 57.7

Microsoft
created on 2022-02-26

person 96.8
text 96.1
clothing 89.6
man 84
black and white 75.9
group 57.7
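
Four services tagged the same photograph, so one simple comparison is set overlap on lower-cased label names. A sketch using a handful of the labels from each list above (the subsets are illustrative, not the full tag sets):

```python
from collections import Counter

# Small lower-cased subsets of each service's labels above.
amazon    = {"school", "room", "classroom", "indoors", "person", "chair", "table"}
imagga    = {"room", "man", "people", "table", "person", "chair", "classroom"}
google    = {"photograph", "chair", "room", "suit", "team", "class", "sitting"}
microsoft = {"person", "text", "clothing", "man", "black and white", "group"}

# Labels every service agrees on (none across all four subsets here).
consensus = amazon & imagga & google & microsoft

# Labels shared by at least two services.
counts = Counter(label for s in (amazon, imagga, google, microsoft) for label in s)
shared = {label for label, n in counts.items() if n >= 2}
print(sorted(shared))
```

On these subsets the four-way consensus is empty, while "room" and "chair" are each reported by three of the four services.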

Face analysis

AWS Rekognition

Age 31-41
Gender Male, 81.2%
Calm 64.5%
Happy 28.9%
Sad 1.5%
Confused 1.5%
Disgusted 1.2%
Surprised 1%
Angry 0.9%
Fear 0.4%

AWS Rekognition

Age 23-33
Gender Male, 92.1%
Sad 70.4%
Confused 14.4%
Fear 6.7%
Disgusted 2.6%
Calm 2.4%
Happy 1.3%
Surprised 1.2%
Angry 1%

AWS Rekognition

Age 19-27
Gender Male, 98.7%
Sad 86.4%
Calm 7.4%
Confused 2.8%
Happy 1.5%
Fear 0.7%
Disgusted 0.6%
Angry 0.4%
Surprised 0.3%

AWS Rekognition

Age 19-27
Gender Female, 51.2%
Calm 98.9%
Sad 0.8%
Confused 0.1%
Angry 0.1%
Fear 0.1%
Surprised 0%
Happy 0%
Disgusted 0%

AWS Rekognition

Age 45-51
Gender Female, 87.8%
Sad 64.6%
Calm 34.4%
Confused 0.4%
Happy 0.2%
Angry 0.1%
Disgusted 0.1%
Surprised 0.1%
Fear 0.1%
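
Each Rekognition face record above ranks eight emotions by confidence, so the dominant emotion is simply the highest-scoring entry. A sketch over the five records, with the top scores copied inline (only the leading emotions are included for brevity):

```python
# Leading emotion scores (%) for the five AWS Rekognition records above.
faces = [
    {"Calm": 64.5, "Happy": 28.9, "Sad": 1.5},
    {"Sad": 70.4, "Confused": 14.4, "Fear": 6.7},
    {"Sad": 86.4, "Calm": 7.4, "Confused": 2.8},
    {"Calm": 98.9, "Sad": 0.8},
    {"Sad": 64.6, "Calm": 34.4},
]

def dominant(emotions):
    """Return the emotion with the highest confidence score."""
    return max(emotions, key=emotions.get)

moods = [dominant(f) for f in faces]
print(moods)  # → ['Calm', 'Sad', 'Sad', 'Calm', 'Sad']
```

Note that the scores express the model's confidence, not measured emotion; three of the five faces in this 1950s party photograph are nonetheless rated predominantly "Sad".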

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
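
Google Vision reports likelihoods as ordinal strings rather than percentages, so comparing them requires mapping the scale to ranks. A sketch that counts how many of the thirteen face records above are rated at least "Possible" for blur (the rank mapping is an assumption about the scale's ordering, not an official API constant):

```python
# Google Vision likelihood strings form an ordinal scale, low to high.
LIKELIHOOD = ["Very unlikely", "Unlikely", "Possible", "Likely", "Very likely"]
RANK = {name: i for i, name in enumerate(LIKELIHOOD)}

# "Blurred" ratings from the thirteen face records above, in order.
blurred = [
    "Very unlikely", "Very unlikely", "Unlikely", "Very unlikely",
    "Possible", "Possible", "Very unlikely", "Likely", "Possible",
    "Very unlikely", "Very unlikely", "Very unlikely", "Very unlikely",
]

# Count faces rated at least "Possible" for blur.
n_blurry = sum(1 for r in blurred if RANK[r] >= RANK["Possible"])
print(n_blurry)  # → 4
```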

Feature analysis

Amazon

Person 99.5%

Captions

Microsoft

a group of people in a room 96.6%
a group of people sitting at a table 86.6%
a group of people standing in a room 86.5%
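
When a captioning service returns several candidates, a reasonable default is to keep the highest-confidence one. A sketch over the three Microsoft captions above:

```python
# The Microsoft caption candidates above, as (text, confidence %) pairs.
captions = [
    ("a group of people in a room", 96.6),
    ("a group of people sitting at a table", 86.6),
    ("a group of people standing in a room", 86.5),
]

# Keep the candidate with the highest confidence.
best = max(captions, key=lambda c: c[1])[0]
print(best)  # → a group of people in a room
```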