Human Generated Data

Title

Untitled (man giving toast at table)

Date

c. 1950

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19775

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.6
Human 99.6
Person 99.2
Person 98.7
Person 98.6
Person 98.5
Person 97.8
Person 95.4
Person 92.7
Indoors 90.9
Interior Design 90.9
Clothing 89.1
Apparel 89.1
Overcoat 76.5
Suit 76.5
Coat 76.5
Room 75.3
People 74
Priest 65
Building 64.7
Architecture 61.7
Bishop 58.2
Sitting 56.4
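
The Amazon labels above pair each tag with a confidence score from 0 to 100. A minimal sketch of how output in this shape could be produced with the AWS Rekognition DetectLabels API (via boto3) follows; the file name and thresholds are placeholders, not part of this record.

# Hypothetical sketch: label detection with AWS Rekognition via boto3.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photograph.jpg", "rb") as f:  # placeholder file name
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=50,
    )

# Each label carries a name and a 0-100 confidence score,
# matching the "label  score" pairs listed in this section.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')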

Imagga
created on 2022-03-05

brass 45.9
wind instrument 41.8
musical instrument 39.4
man 32.9
male 29.1
person 27.7
people 21.2
businessman 18.5
adult 18.5
business 17
group 16.9
men 16.3
couple 14.8
room 13.5
banjo 13.5
stringed instrument 13.3
cornet 13.2
happy 13.1
sax 12.9
home 12.8
teacher 12.5
interior 11.5
senior 11.2
sitting 11.2
women 11.1
speaker 11
music 10.8
handsome 10.7
old 10.4
boy 10.4
black 10.2
happiness 10.2
student 10.2
family 9.8
job 9.7
device 9.6
education 9.5
smiling 9.4
horn 9.3
two 9.3
hand 9.1
new 8.9
indoors 8.8
lifestyle 8.7
table 8.6
stage 8.6
meeting 8.5
modern 8.4
fashion 8.3
confident 8.2
dress 8.1
cheerful 8.1
worker 8.1
religion 8.1
team 8.1
office 8
looking 8
work 8
professional 7.9
hands 7.8
scene 7.8
portrait 7.8
life 7.7
fun 7.5
holding 7.4
articulator 7.4
indoor 7.3
playing 7.3
suit 7.2
art 7.2
night 7.1
together 7
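
The Imagga tags follow the same pattern of label plus confidence score. A hedged sketch against Imagga's v2 tagging endpoint is shown below; the API key, secret, and image URL are placeholders, not part of this record.

# Hedged sketch: requesting tags from the Imagga v2 API over HTTP.
import requests

IMAGGA_KEY = "your_api_key"        # placeholder credential
IMAGGA_SECRET = "your_api_secret"  # placeholder credential

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photograph.jpg"},  # placeholder URL
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)

# Assumed response shape: result.tags[] with a confidence and an English tag name.
for item in response.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')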

Google
created on 2022-03-05

Black 89.6
Style 83.9
Black-and-white 83.8
Monochrome 74.2
Art 74.1
Suit 73.9
Monochrome photography 73.9
Vintage clothing 73.9
Event 73.6
Room 66.6
History 65.4
Stock photography 65.1
Font 63.7
Crowd 61.5
Visual arts 60.3
Sitting 57.8
Illustration 56.3
Fur 55.2
Photo caption 51.9
Retro style 51.5
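
The Google labels above could be obtained with the Cloud Vision label-detection method. The sketch below assumes the google-cloud-vision Python client (2.x) and a placeholder file name, and scales the 0-1 scores to percentages to match the list.

# Minimal sketch: label detection with the Google Cloud Vision client library.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("photograph.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)
for label in response.label_annotations:
    # score is a 0-1 float; print as a percentage to match this record
    print(f"{label.description} {label.score * 100:.1f}")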

Microsoft
created on 2022-03-05

wall 97.2
text 94.6
person 90.2
black and white 85.6
clothing 85.1
woman 54.3
old 45.7
different 33.5
several 15.5
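
The Microsoft tags resemble the output of the Azure Computer Vision tagging operation. The sketch below is an assumption about how such tags might be requested; the endpoint, key, and file name are placeholders, not part of this record.

# Hedged sketch: image tagging with the Azure Computer Vision SDK.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("your_subscription_key"),   # placeholder key
)

with open("photograph.jpg", "rb") as f:  # placeholder file name
    result = client.tag_image_in_stream(f)

# Each tag has a name and a 0-1 confidence, printed here as a percentage.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")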

Face analysis

AWS Rekognition

Age 28-38
Gender Male, 99.2%
Sad 65.4%
Happy 19.7%
Calm 6.6%
Confused 3.7%
Angry 1.5%
Disgusted 1.3%
Surprised 1.2%
Fear 0.7%

AWS Rekognition

Age 30-40
Gender Female, 81.2%
Happy 45.8%
Disgusted 20.2%
Sad 12.6%
Surprised 9.3%
Calm 6.9%
Confused 3.1%
Fear 1.2%
Angry 1%

AWS Rekognition

Age 35-43
Gender Female, 72.8%
Happy 92.9%
Sad 4.6%
Confused 0.9%
Calm 0.6%
Surprised 0.3%
Angry 0.3%
Disgusted 0.3%
Fear 0.2%

AWS Rekognition

Age 48-56
Gender Male, 99.5%
Happy 62.6%
Confused 16.8%
Sad 8.1%
Calm 3.8%
Surprised 2.9%
Disgusted 2.5%
Angry 2.2%
Fear 1.1%

AWS Rekognition

Age 39-47
Gender Female, 55.3%
Calm 37.7%
Confused 36.9%
Sad 11.8%
Happy 3.4%
Angry 2.8%
Surprised 2.7%
Fear 2.4%
Disgusted 2.3%
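
Each AWS Rekognition block above reports an estimated age range, a gender guess with confidence, and a ranked list of emotions. A minimal sketch of retrieving these fields with the DetectFaces API (via boto3) follows; the file name is a placeholder.

# Sketch: face attribute estimation with AWS Rekognition DetectFaces.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photograph.jpg", "rb") as f:  # placeholder file name
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # emotions come back unordered; sort by confidence as in this record
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')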

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely
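
Google Vision reports face attributes as likelihood ratings (Very unlikely, Unlikely, and so on) rather than numeric scores. The sketch below, assuming the google-cloud-vision 2.x client and a placeholder file name, shows how those ratings might be read.

# Sketch: face detection likelihoods with the Google Cloud Vision client library.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("photograph.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # each attribute is a Likelihood enum such as VERY_UNLIKELY or UNLIKELY
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)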

Feature analysis

Amazon

Person 99.6%

Captions

Microsoft

a group of people performing on a counter 89.9%
a group of people standing in front of a store 82.1%
a group of people standing in a room 82%
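
The three Microsoft captions each carry a confidence score. A hedged sketch of requesting multiple caption candidates from the Azure Computer Vision describe operation follows; the endpoint, key, and file name are placeholders, not part of this record.

# Hedged sketch: caption generation with the Azure Computer Vision SDK.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("your_subscription_key"),   # placeholder key
)

with open("photograph.jpg", "rb") as f:  # placeholder file name
    description = client.describe_image_in_stream(f, max_candidates=3)

# Each caption has text and a 0-1 confidence, printed here as a percentage.
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")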

Text analysis

Amazon

45

Google

45
45 MJI7--YTER2--XAa
MJI7--YTER2--XAa
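
The detected text above comes from OCR over the photograph. A minimal sketch of extracting such text with the AWS Rekognition DetectText API (via boto3) follows; the file name is a placeholder, and the Google column would come from the comparable Vision API text-detection method.

# Sketch: text detection (OCR) with AWS Rekognition DetectText.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photograph.jpg", "rb") as f:  # placeholder file name
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip individual WORD detections
        print(detection["DetectedText"])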