Human Generated Data

Title

Untitled (long tables of guests at banquet)

Date

1950

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20191

Machine Generated Data

Tags (confidence scores out of 100)

Amazon
created on 2022-03-05

Audience 99.6
Crowd 99.6
Human 99.6
Person 99.3
Person 99
Indoors 98.9
Person 98.8
Room 98.5
Person 98.4
Person 98.1
Person 93
Interior Design 92.7
Person 92
Classroom 89.2
School 89.2
Person 88.2
Person 83.5
Person 82.6
Clinic 81.8
Person 81.3
Person 80
Lecture 75.2
Speech 75.2
Hall 75.1
Person 71.4
Meeting Room 69.6
Conference Room 69.6
Person 69.2
People 68.6
Tie 67.1
Accessories 67.1
Accessory 67.1
Person 66.4
Auditorium 64.9
Theater 64.9
Seminar 57.5
Waiting Room 55.5
Furniture 55.4
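
The label-and-score pairs above follow the shape of Amazon Rekognition's label-detection output. Below is a minimal sketch of how a comparable list can be produced with boto3; the file name, MaxLabels, and MinConfidence values are placeholder assumptions, not the settings used to generate this record.

```python
# Minimal sketch: list labels for a local image with Amazon Rekognition.
# Assumes AWS credentials are configured; "banquet.jpg" is a placeholder file name.
import boto3

rekognition = boto3.client("rekognition")

with open("banquet.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,       # assumed cap, not the value used for this record
    MinConfidence=55,   # assumed threshold, not the value used for this record
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```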

Clarifai
created on 2023-10-22

people 99.6
many 98.6
group 98
wedding 97.7
indoors 97.4
man 96.3
woman 94.8
room 92.4
adult 92.4
group together 92.3
monochrome 89.4
ceremony 89.3
education 88.9
child 87
music 85.7
school 85.5
crowd 83.5
chair 82.9
boy 82.5
administration 81.7

Imagga
created on 2022-03-05

man 31.6
room 26
people 25.1
teacher 25
adult 24.4
hall 23.5
person 23
couple 21.8
senior 21.5
home 21.5
male 21.3
table 20.8
restaurant 20.7
indoors 20.2
interior 18.6
men 18
happy 17.5
educator 17.3
life 16.8
hospital 15.7
indoor 15.5
elderly 15.3
chair 15.1
professional 14.5
together 14
women 13.4
old 12.5
sitting 12
love 11.8
smiling 11.6
lifestyle 11.6
husband 11.4
modern 11.2
mature 11.1
celebration 10.4
meeting 10.4
dinner 10.2
happiness 10.2
classroom 10.1
entrepreneur 10
care 9.9
family 9.8
cheerful 9.7
enjoying 9.5
enjoyment 9.4
smile 9.3
occupation 9.2
group 8.9
nurse 8.8
medical 8.8
chairs 8.8
worker 8.6
married 8.6
glass 8.6
business 8.5
friends 8.4
portrait 8.4
communication 8.4
drink 8.3
holding 8.2
student 7.9
education 7.8
two people 7.8
retired 7.7
retirement 7.7
groom 7.7
drinking 7.6
office 7.6
health 7.6
hotel 7.6
furniture 7.6
clothing 7.6
wife 7.6
desk 7.6
togetherness 7.5
grandfather 7.5
human 7.5
fun 7.5
help 7.4
wedding 7.3
aged 7.2
lunch 7.1
to 7.1
work 7.1

Google
created on 2022-03-05

Photograph 94.2
Black 89.7
Black-and-white 85.2
Style 83.8
Line 82.1
Monochrome 79
Crowd 78.3
Monochrome photography 78.1
Snapshot 74.3
Event 73.3
Hat 70.4
Font 67.6
Art 67.2
Room 64.5
Stock photography 64.2
History 61.5
Audience 60
Ceiling 58.8
T-shirt 58.5
Cap 55.8
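
These labels match the form of a Google Cloud Vision label-detection response, with scores scaled to 100. A minimal sketch using the google-cloud-vision client library follows; the file name is a placeholder, and the scaling is an assumption based on the values shown above.

```python
# Minimal sketch: image labels with Google Cloud Vision.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("banquet.jpg", "rb") as f:   # placeholder file name
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

for label in response.label_annotations:
    # The API returns scores in [0, 1]; the list above appears to scale them to 100.
    print(label.description, f"{label.score * 100:.1f}")
```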

Microsoft
created on 2022-03-05

person 99.3
outdoor 90.6
people 60.5
group 57.1
man 54.7
crowd 52.7

Face analysis

AWS Rekognition

Age 39-47
Gender Female, 97.4%
Calm 94.9%
Sad 2.8%
Disgusted 0.8%
Happy 0.7%
Confused 0.3%
Fear 0.2%
Angry 0.1%
Surprised 0.1%

AWS Rekognition

Age 25-35
Gender Male, 99.1%
Calm 99.1%
Surprised 0.4%
Confused 0.3%
Sad 0.1%
Disgusted 0.1%
Angry 0%
Happy 0%
Fear 0%

AWS Rekognition

Age 23-31
Gender Female, 66.9%
Calm 64.8%
Sad 29.7%
Confused 4%
Fear 0.4%
Surprised 0.4%
Disgusted 0.4%
Happy 0.2%
Angry 0.2%

AWS Rekognition

Age 36-44
Gender Male, 94.5%
Calm 87.5%
Sad 8.5%
Confused 1.6%
Happy 1.4%
Fear 0.3%
Angry 0.3%
Disgusted 0.2%
Surprised 0.2%

AWS Rekognition

Age 36-44
Gender Male, 95.4%
Calm 81.5%
Confused 10.7%
Disgusted 3.5%
Sad 2.3%
Surprised 0.7%
Happy 0.5%
Angry 0.5%
Fear 0.3%

AWS Rekognition

Age 20-28
Gender Female, 85.8%
Calm 69.6%
Sad 12.4%
Happy 9.1%
Fear 3.5%
Angry 1.8%
Disgusted 1.7%
Confused 1.4%
Surprised 0.5%

AWS Rekognition

Age 49-57
Gender Female, 60.3%
Sad 98.4%
Angry 0.5%
Calm 0.4%
Confused 0.2%
Surprised 0.1%
Disgusted 0.1%
Happy 0.1%
Fear 0.1%

AWS Rekognition

Age 6-12
Gender Female, 75.3%
Calm 94.7%
Happy 1.3%
Sad 1%
Fear 0.7%
Surprised 0.6%
Angry 0.5%
Confused 0.5%
Disgusted 0.5%

AWS Rekognition

Age 49-57
Gender Male, 82.8%
Calm 98.9%
Happy 0.5%
Sad 0.3%
Disgusted 0.1%
Confused 0.1%
Fear 0%
Surprised 0%
Angry 0%

AWS Rekognition

Age 34-42
Gender Male, 85.5%
Angry 71.9%
Calm 10.3%
Sad 8.4%
Fear 4.3%
Disgusted 2.6%
Surprised 1.5%
Happy 0.6%
Confused 0.4%
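
The per-face blocks above (age range, gender, and emotion percentages) mirror the output of Amazon Rekognition face detection. A minimal sketch of retrieving such estimates with boto3 follows; the file name is a placeholder and the attribute selection is an assumption.

```python
# Minimal sketch: per-face age range, gender, and emotion scores with Rekognition.
import boto3

rekognition = boto3.client("rekognition")

with open("banquet.jpg", "rb") as f:   # placeholder file name
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```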

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely
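
The ratings above match the shape of a Google Cloud Vision face-detection response, which reports emotions and image properties as likelihood buckets rather than percentages. A minimal sketch, assuming the google-cloud-vision client library and a placeholder file name:

```python
# Minimal sketch: per-face likelihood ratings with Google Cloud Vision.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("banquet.jpg", "rb") as f:   # placeholder file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```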

Feature analysis

Amazon

Person 99.3%
Person 99%
Person 98.8%
Person 98.4%
Person 98.1%
Person 93%
Person 92%
Person 88.2%
Person 83.5%
Person 82.6%
Person 81.3%
Person 80%
Person 71.4%
Person 69.2%
Person 66.4%
Tie 67.1%

Text analysis

Amazon

DO
KODVKSVEEIX
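
The two lines above appear to be machine-read text from the photograph, in the form returned by Amazon Rekognition text detection. A minimal sketch of line-level text detection with boto3, using a placeholder file name:

```python
# Minimal sketch: line-level text detection with Amazon Rekognition.
import boto3

rekognition = boto3.client("rekognition")

with open("banquet.jpg", "rb") as f:   # placeholder file name
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}")
```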