Human Generated Data

Title

Untitled (banquet with long tables)

Date

1950

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20192

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.2
Human 99.2
Audience 99
Crowd 99
Person 98.4
Person 97.8
Indoors 97.2
Person 97
Person 95.2
Room 94.5
Person 94.1
Person 90.6
Interior Design 84.6
Hall 80.6
Auditorium 78.8
Theater 78.8
Person 78.5
Person 76.4
People 73.1
Person 72.8
Lecture 65.7
Speech 65.7
Person 65.3
Person 62.8
Person 61.6
Person 61
Meeting Room 59.5
Conference Room 59.5
Seminar 57
Classroom 57
School 57
Waiting Room 56.1
Reception Room 55.4
Furniture 55.4
Reception 55.4
Person 53.6
Person 47.4
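
The Amazon tags above are label/confidence pairs of the kind returned by the Rekognition DetectLabels API. The following is a minimal sketch of that call with boto3; the file name, label cap, and confidence floor are illustrative assumptions rather than values recorded here.

```python
# Minimal sketch: request image labels from Amazon Rekognition with boto3.
# "banquet.jpg", MaxLabels, and MinConfidence are illustrative assumptions.
import boto3

def detect_labels(image_path: str, min_confidence: float = 45.0):
    """Return (label, confidence) pairs similar to the Amazon tags listed above."""
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=50,
        MinConfidence=min_confidence,
    )
    return [(label["Name"], round(label["Confidence"], 1)) for label in response["Labels"]]

if __name__ == "__main__":
    for name, confidence in detect_labels("banquet.jpg"):
        print(name, confidence)
```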

Clarifai
created on 2023-10-22

people 99.8
group 98.3
education 98
many 97.9
indoors 97.2
man 96.7
school 96.4
wedding 96.2
woman 95.5
child 95.4
adult 92.8
ceremony 91.2
group together 90.8
leader 90.3
room 88.7
monochrome 88.6
boy 88.2
administration 86.6
chair 85.8
furniture 77.6

Imagga
created on 2022-03-05

room 31.4
hall 30
man 26.2
table 25.9
teacher 25.1
interior 24.7
people 24
person 23.6
restaurant 23.2
chair 19.9
entrepreneur 18.5
indoors 18.4
classroom 17.9
home 16.7
adult 16.3
couple 14.8
life 14.5
professional 14.3
educator 14.3
male 14.2
indoor 13.7
student 13.4
modern 13.3
men 12.9
hospital 12.4
meeting 12.2
senior 12.2
women 11.8
dinner 11.8
communication 11.7
furniture 11.4
group 11.3
party 11.2
center 11
house 10.9
lifestyle 10.8
chairs 10.8
education 10.4
smiling 10.1
happy 10
design 9.6
building 9.5
sitting 9.4
office 9.4
service 9.2
school 9.2
occupation 9.2
business 9.1
old 9
cafeteria 8.9
lunch 8.9
crowd 8.6
work 8.6
elderly 8.6
hotel 8.6
dining 8.6
food 8.4
enjoyment 8.4
executive 8.1
team 8.1
family 8
together 7.9
happiness 7.8
glass 7.8
empty 7.7
class 7.7
comfortable 7.6
talking 7.6
desk 7.6
togetherness 7.5
eat 7.5
relaxation 7.5
stage 7.5
floor 7.4
technology 7.4
teamwork 7.4
care 7.4
light 7.3
cheerful 7.3
computer 7.2
suit 7.2
love 7.1
decor 7.1
businessman 7.1
architecture 7

Google
created on 2022-03-05

Photograph 94.2
Black 89.8
Black-and-white 85.9
Style 84
Line 82.3
Font 79.1
Monochrome 78.3
Monochrome photography 77.7
Crowd 76.4
Snapshot 74.3
Event 73.9
Art 70.8
Suit 69.6
Room 66.7
History 66.3
Stock photography 65
Audience 62.2
Photo caption 56.1
Team 54
Paper product 51.1
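
The Google tags above follow the same label/score pattern; the Cloud Vision API reports scores between 0 and 1, shown here rescaled to percentages. A minimal sketch, assuming the google-cloud-vision client library and an illustrative file name:

```python
# Minimal sketch: request labels from the Google Cloud Vision API.
# "banquet.jpg" is an illustrative assumption; scores are rescaled to match the list above.
from google.cloud import vision

def google_labels(image_path: str):
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.label_detection(image=image)
    return [(ann.description, round(ann.score * 100, 1)) for ann in response.label_annotations]

if __name__ == "__main__":
    for description, score in google_labels("banquet.jpg"):
        print(description, score)
```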

Microsoft
created on 2022-03-05

person 99.7
outdoor 91.6
man 79
clothing 69.6
people 65.4
group 63.4
crowd 56.3

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 29-39
Gender Male, 53.7%
Calm 46.5%
Sad 22.2%
Confused 11.4%
Fear 6.7%
Surprised 4.5%
Disgusted 3.2%
Happy 2.8%
Angry 2.6%

AWS Rekognition

Age 19-27
Gender Female, 62.2%
Calm 97.6%
Sad 2%
Confused 0.1%
Surprised 0.1%
Disgusted 0.1%
Angry 0.1%
Fear 0.1%
Happy 0%

AWS Rekognition

Age 43-51
Gender Male, 99.9%
Calm 98.7%
Sad 0.5%
Confused 0.4%
Disgusted 0.2%
Happy 0.1%
Angry 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 23-31
Gender Male, 88.8%
Calm 99.6%
Sad 0.3%
Angry 0%
Happy 0%
Confused 0%
Surprised 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 34-42
Gender Male, 82.3%
Calm 97.4%
Confused 1%
Happy 0.5%
Sad 0.3%
Surprised 0.3%
Disgusted 0.3%
Angry 0.1%
Fear 0%

AWS Rekognition

Age 16-24
Gender Male, 99.3%
Calm 54.5%
Sad 33.2%
Happy 5.4%
Angry 2.8%
Confused 2.3%
Fear 1.2%
Surprised 0.3%
Disgusted 0.3%

AWS Rekognition

Age 25-35
Gender Male, 79.3%
Sad 67.3%
Calm 28.1%
Confused 1.7%
Happy 0.9%
Surprised 0.7%
Fear 0.6%
Disgusted 0.4%
Angry 0.3%

AWS Rekognition

Age 20-28
Gender Female, 70.2%
Calm 98.2%
Sad 0.8%
Happy 0.4%
Confused 0.2%
Disgusted 0.2%
Surprised 0.1%
Angry 0.1%
Fear 0%

AWS Rekognition

Age 48-56
Gender Male, 96.5%
Sad 41.8%
Calm 37.8%
Disgusted 6.5%
Fear 5%
Happy 2.7%
Confused 2.6%
Angry 2.4%
Surprised 1.1%

AWS Rekognition

Age 24-34
Gender Male, 97.8%
Calm 81.2%
Sad 11.8%
Happy 2.2%
Fear 2%
Disgusted 0.9%
Angry 0.9%
Confused 0.8%
Surprised 0.3%
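
The age, gender, and emotion estimates above are per-face results of the kind returned by the Rekognition DetectFaces API when all attributes are requested. A minimal sketch with boto3, again with an illustrative file name:

```python
# Minimal sketch: per-face age, gender, and emotion estimates from Amazon Rekognition.
# "banquet.jpg" is an illustrative assumption.
import boto3

def detect_faces(image_path: str):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions arrive unordered; sort by confidence to match the listings above.
        for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")

if __name__ == "__main__":
    detect_faces("banquet.jpg")
```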

Google

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely
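
The Google Vision entries above report each face attribute as a likelihood bucket (Very unlikely through Very likely) rather than a percentage. A minimal sketch of the face-detection call, assuming the google-cloud-vision client library and an illustrative file name:

```python
# Minimal sketch: per-face likelihood buckets from the Google Cloud Vision API.
# "banquet.jpg" is an illustrative assumption.
from google.cloud import vision

def pretty(likelihood) -> str:
    # e.g. VERY_UNLIKELY -> "Very unlikely", matching the listings above.
    return likelihood.name.replace("_", " ").capitalize()

def google_faces(image_path: str):
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.face_detection(image=image)
    for face in response.face_annotations:
        print("Surprise", pretty(face.surprise_likelihood))
        print("Anger", pretty(face.anger_likelihood))
        print("Sorrow", pretty(face.sorrow_likelihood))
        print("Joy", pretty(face.joy_likelihood))
        print("Headwear", pretty(face.headwear_likelihood))
        print("Blurred", pretty(face.blurred_likelihood))

if __name__ == "__main__":
    google_faces("banquet.jpg")
```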

Feature analysis

Amazon

Person
Person 99.2%
Person 98.4%
Person 97.8%
Person 97%
Person 95.2%
Person 94.1%
Person 90.6%
Person 78.5%
Person 76.4%
Person 72.8%
Person 65.3%
Person 62.8%
Person 61.6%
Person 61%
Person 53.6%
Person 47.4%

Text analysis

Amazon

5
KODVRVEELA
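
The entries above are strings of the kind returned by the Rekognition DetectText API; short or garbled readings like these typically reflect text the model could only partially resolve. A minimal sketch with boto3, with an illustrative file name:

```python
# Minimal sketch: detected text lines from Amazon Rekognition.
# "banquet.jpg" is an illustrative assumption.
import boto3

def detect_text(image_path: str):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})
    # Keep whole lines; Rekognition also returns individual WORD detections.
    return [d["DetectedText"] for d in response["TextDetections"] if d["Type"] == "LINE"]

if __name__ == "__main__":
    for line in detect_text("banquet.jpg"):
        print(line)
```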