Human Generated Data

Title

Untitled (people at banquet)

Date

1952

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20210

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Human 96.8
Person 96.2
Person 96.1
Person 95.7
Person 93.8
Person 89.6
Restaurant 87.7
Coat 86.9
Suit 86.9
Apparel 86.9
Clothing 86.9
Overcoat 86.9
Meal 85.2
Food 85.2
Furniture 84.3
Table 84.3
Person 83
Person 81.9
Face 76.5
Person 76.3
Crowd 73.8
Dining Table 73.2
Person 69.9
Person 69.2
People 65.9
Building 63.5
Person 61.5
Cafeteria 61.3
Beverage 60.1
Drink 60.1
Tuxedo 60
Dish 59.7
Bar Counter 58.1
Pub 58.1
Text 57.1
Dating 56.3
Chair 56.3
Person 55.8

Imagga
created on 2022-03-05

percussion instrument 59.1
marimba 54.7
musical instrument 51.9
man 28.9
people 27.3
person 23.4
male 20.6
adult 20.3
men 16.3
teacher 15.4
couple 14.8
happy 14.4
businessman 14.1
silhouette 14.1
business 14
office 13.8
lifestyle 13
group 12.9
sitting 12.9
women 12.7
classroom 12.4
table 12.4
indoors 12.3
room 12.1
vibraphone 11.9
indoor 11.9
team 11.6
smiling 11.6
black 10.8
professional 10.7
stringed instrument 10.6
blackboard 10.5
love 10.3
happiness 10.2
smile 10
cheerful 9.8
chair 9.6
education 9.5
meeting 9.4
water 9.3
two 9.3
modern 9.1
job 8.8
interior 8.8
class 8.7
scene 8.7
work 8.6
desk 8.5
device 8.4
relaxation 8.4
fun 8.2
romantic 8
hall 7.9
discussion 7.8
glass 7.8
youth 7.7
communication 7.6
human 7.5
manager 7.4
mature 7.4
stage 7.4
school 7.3
music 7.2
looking 7.2
sunset 7.2
worker 7.2
portrait 7.1
day 7.1
together 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 99.2
person 94.9
man 78.5
black and white 78.3
clothing 74.9
group 71
people 61.3

Face analysis

AWS Rekognition

Age 41-49
Gender Male, 98.7%
Calm 95.2%
Happy 2.6%
Sad 0.9%
Confused 0.6%
Surprised 0.4%
Disgusted 0.1%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 53-61
Gender Male, 74.7%
Calm 95.3%
Sad 3.2%
Confused 0.7%
Happy 0.3%
Surprised 0.2%
Disgusted 0.1%
Fear 0.1%
Angry 0.1%

AWS Rekognition

Age 36-44
Gender Male, 99.3%
Sad 84.4%
Calm 11.1%
Confused 2%
Happy 0.9%
Surprised 0.7%
Angry 0.4%
Disgusted 0.3%
Fear 0.2%

AWS Rekognition

Age 22-30
Gender Female, 67.4%
Sad 83%
Calm 13%
Confused 1.8%
Happy 0.6%
Disgusted 0.5%
Surprised 0.5%
Angry 0.3%
Fear 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 96.2%

Captions

Microsoft

a group of people sitting on a bench 70.7%
a group of people sitting at a bench 70.2%
a group of people that are sitting on a bench 58.2%

Text analysis

Amazon

ea
YAGOY