Human Generated Data

Title

Untitled (men at diner)

Date

c. 1950, printed later

People

Artist: Orrion Barger, American, active 1913–1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.183

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.2
Human 99.2
Person 98.9
Person 98.2
Chair 96.5
Furniture 96.5
Restaurant 96.2
Person 95.3
Person 93.4
Person 91.6
Person 90.3
Person 87.4
Sitting 82.5
Person 81.8
Person 81.2
Chair 80.3
Cafeteria 75.6
Dining Table 67.8
Table 67.8
Cafe 67.1
Food 60.3
Pub 59.3
Advertisement 59
Collage 59
Poster 59
Food Court 58.4
Meal 57.6
Bar Counter 55.5
Person 43.5
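
A minimal sketch of how label tags like the Amazon list above can be produced with AWS Rekognition's DetectLabels call via boto3. The image file name and region are placeholders, and AWS credentials are assumed to be configured in the environment; the scores returned are confidences in percent, matching the rows above.

import boto3

def detect_labels(image_path: str, min_confidence: float = 40.0):
    client = boto3.client("rekognition", region_name="us-east-1")  # region is a placeholder
    with open(image_path, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=min_confidence,
        )
    # Each label carries a name and a 0-100 confidence score.
    return [(label["Name"], label["Confidence"]) for label in response["Labels"]]

if __name__ == "__main__":
    for name, confidence in detect_labels("men_at_diner.jpg"):  # hypothetical file name
        print(f"{name} {confidence:.1f}")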

Imagga
created on 2022-01-23

equipment 20.7
people 20.1
indoors 18.4
room 18.4
television 18.3
window 18.2
table 17.9
interior 17.7
business 17
office 16.4
desk 16.4
chair 16.2
man 16.1
monitor 15.2
computer 15.2
modern 14
furniture 13.7
person 13.2
male 12.8
home 12
newspaper 11.3
men 11.2
technology 11.1
indoor 10.9
house 10.9
working 10.6
shop 10.3
work 10.2
light 10
silhouette 9.9
blackboard 9.6
black 9.6
design 9.6
inside 9.2
barbershop 8.9
electronic equipment 8.8
adult 8.7
product 8.6
sitting 8.6
glass 8.6
floor 8.4
kitchen 8
restaurant 8
job 8
education 7.8
center 7.5
film 7.4
group 7.2
classroom 7.2
building 7.1
decor 7.1
architecture 7
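
A hedged sketch of fetching tags comparable to the Imagga list above through Imagga's v2 tagging endpoint with the requests library. The endpoint path, response shape, and credentials below are assumptions based on Imagga's public REST API and should be checked against its documentation.

import requests

API_KEY = "your_api_key"        # placeholder
API_SECRET = "your_api_secret"  # placeholder

def imagga_tags(image_path: str):
    with open(image_path, "rb") as f:
        response = requests.post(
            "https://api.imagga.com/v2/tags",   # assumed endpoint
            auth=(API_KEY, API_SECRET),
            files={"image": f},
        )
    response.raise_for_status()
    # Assumed response layout: result.tags[].tag.en plus a 0-100 confidence.
    return [
        (item["tag"]["en"], item["confidence"])
        for item in response.json()["result"]["tags"]
    ]

if __name__ == "__main__":
    for tag, confidence in imagga_tags("men_at_diner.jpg"):  # hypothetical file name
        print(f"{tag} {confidence:.1f}")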

Google
created on 2022-01-23

Furniture 94.2
Photograph 94.2
Table 93.6
Chair 88.3
Black-and-white 84.7
Style 83.9
Suit 75.5
Snapshot 74.3
Monochrome photography 72.7
Monochrome 72.6
Event 69.7
Room 68.8
Stock photography 63.2
Building 62.1
Font 61
Sitting 60.1
Desk 59.8
Cafeteria 56.9
History 56.8
Vintage clothing 55.4
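
A minimal sketch of label detection with the google-cloud-vision client, which yields labels with scores like the Google list above. The file name is a placeholder, and application credentials are assumed to be set via GOOGLE_APPLICATION_CREDENTIALS.

from google.cloud import vision

def google_labels(image_path: str):
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.label_detection(image=image)
    # Scores come back in 0-1 and are scaled to percent to match the list above.
    return [(ann.description, ann.score * 100) for ann in response.label_annotations]

if __name__ == "__main__":
    for label, score in google_labels("men_at_diner.jpg"):  # hypothetical file name
        print(f"{label} {score:.1f}")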

Microsoft
created on 2022-01-23

text 99.9
table 94.4
black and white 91.6
person 90.9
indoor 90.3
furniture 89.5
clothing 83.3
man 80.6
chair 62
cluttered 11.4
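
A hedged sketch of producing tags like the Microsoft list above with the azure-cognitiveservices-vision-computervision SDK. Endpoint, key, and file name are placeholders, and the method names should be checked against the SDK version in use.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
KEY = "your_subscription_key"                                      # placeholder

def azure_tags(image_path: str):
    client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))
    with open(image_path, "rb") as f:
        result = client.tag_image_in_stream(f)
    # Confidences are returned in 0-1 and scaled to percent here.
    return [(tag.name, tag.confidence * 100) for tag in result.tags]

if __name__ == "__main__":
    for name, confidence in azure_tags("men_at_diner.jpg"):  # hypothetical file name
        print(f"{name} {confidence:.1f}")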

Face analysis

AWS Rekognition

Age 51-59
Gender Male, 97.6%
Happy 98.5%
Confused 0.4%
Calm 0.4%
Surprised 0.2%
Disgusted 0.1%
Fear 0.1%
Angry 0.1%
Sad 0.1%

AWS Rekognition

Age 40-48
Gender Male, 99.7%
Sad 94.2%
Fear 2.6%
Calm 1.1%
Disgusted 0.7%
Angry 0.5%
Confused 0.5%
Surprised 0.3%
Happy 0.1%

AWS Rekognition

Age 68-78
Gender Male, 100%
Calm 73.3%
Surprised 9.7%
Fear 4.7%
Disgusted 3.9%
Angry 3.2%
Sad 2%
Confused 1.9%
Happy 1.3%

AWS Rekognition

Age 56-64
Gender Male, 99.9%
Calm 78.7%
Confused 15.3%
Sad 1.9%
Surprised 1.3%
Angry 1%
Disgusted 0.7%
Fear 0.7%
Happy 0.4%

AWS Rekognition

Age 54-64
Gender Male, 100%
Calm 95.3%
Happy 2.1%
Confused 0.7%
Disgusted 0.6%
Angry 0.5%
Surprised 0.3%
Sad 0.2%
Fear 0.2%

AWS Rekognition

Age 52-60
Gender Male, 99.1%
Calm 98.3%
Surprised 0.5%
Angry 0.4%
Disgusted 0.4%
Confused 0.1%
Sad 0.1%
Fear 0.1%
Happy 0%

AWS Rekognition

Age 39-47
Gender Male, 100%
Angry 41.1%
Sad 20.1%
Happy 11.9%
Disgusted 11.4%
Fear 6.6%
Surprised 4.8%
Confused 2.5%
Calm 1.6%

AWS Rekognition

Age 35-43
Gender Male, 99.9%
Happy 92.2%
Surprised 2.3%
Sad 1.9%
Calm 1%
Disgusted 0.8%
Angry 0.7%
Fear 0.6%
Confused 0.5%

AWS Rekognition

Age 47-53
Gender Male, 98.7%
Calm 82.7%
Happy 16.2%
Sad 0.4%
Disgusted 0.2%
Confused 0.2%
Surprised 0.2%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 19-27
Gender Female, 66.6%
Calm 33.1%
Fear 30.7%
Happy 19.1%
Sad 11.3%
Surprised 2.9%
Angry 1.3%
Disgusted 0.9%
Confused 0.8%

AWS Rekognition

Age 25-35
Gender Female, 96.9%
Disgusted 94.8%
Sad 1.6%
Angry 1.2%
Fear 1.1%
Surprised 0.7%
Confused 0.4%
Happy 0.1%
Calm 0.1%

AWS Rekognition

Age 45-51
Gender Male, 99.9%
Calm 97.7%
Sad 1.1%
Angry 0.4%
Confused 0.4%
Surprised 0.2%
Disgusted 0.1%
Fear 0%
Happy 0%

AWS Rekognition

Age 51-59
Gender Male, 99.8%
Calm 59%
Angry 15.3%
Sad 8.7%
Fear 7.8%
Surprised 3.5%
Happy 3%
Disgusted 1.8%
Confused 0.8%
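
A minimal sketch of how per-face estimates like the AWS Rekognition blocks above (age range, gender, and a confidence per emotion) come from DetectFaces with all attributes requested. Image path and region are placeholders; credentials are assumed to be configured.

import boto3

def detect_faces(image_path: str):
    client = boto3.client("rekognition", region_name="us-east-1")  # region is a placeholder
    with open(image_path, "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],
        )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions are reported as a confidence per emotion, highest first.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")

if __name__ == "__main__":
    detect_faces("men_at_diner.jpg")  # hypothetical file name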

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
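
A minimal sketch of the Google Vision face results above, which report likelihood buckets (Surprise, Anger, Sorrow, Joy, Headwear, Blurred) rather than numeric emotion scores. The file name is a placeholder and credentials are assumed to be configured for the google-cloud-vision client.

from google.cloud import vision

LIKELIHOOD = ["Unknown", "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely"]

def google_faces(image_path: str):
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Each field is an enum value indexing into the likelihood buckets above.
        print("Surprise", LIKELIHOOD[int(face.surprise_likelihood)])
        print("Anger", LIKELIHOOD[int(face.anger_likelihood)])
        print("Sorrow", LIKELIHOOD[int(face.sorrow_likelihood)])
        print("Joy", LIKELIHOOD[int(face.joy_likelihood)])
        print("Headwear", LIKELIHOOD[int(face.headwear_likelihood)])
        print("Blurred", LIKELIHOOD[int(face.blurred_likelihood)])

if __name__ == "__main__":
    google_faces("men_at_diner.jpg")  # hypothetical file name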

Feature analysis

Amazon

Person 99.2%
Chair 96.5%

Captions

Microsoft

a group of people in a room 88.9%
a group of people standing in a room 85.4%
a group of people in a kitchen 71.9%
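
A hedged sketch of how caption candidates like the three above can be requested from Azure Computer Vision (describe_image_in_stream). Endpoint, key, and file name are placeholders.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
KEY = "your_subscription_key"                                      # placeholder

def azure_captions(image_path: str, max_candidates: int = 3):
    client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))
    with open(image_path, "rb") as f:
        description = client.describe_image_in_stream(f, max_candidates=max_candidates)
    # Confidences are returned in 0-1 and scaled to percent here.
    return [(c.text, c.confidence * 100) for c in description.captions]

if __name__ == "__main__":
    for text, confidence in azure_captions("men_at_diner.jpg"):  # hypothetical file name
        print(f"{text} {confidence:.1f}")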

Text analysis

Amazon

Peggy
Mians

Google

Peggy
Peggy
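
A minimal sketch of how word-level OCR results like those above ("Peggy", "Mians") are obtained from AWS Rekognition's DetectText and Google Vision's text detection. File names, region, and credentials are placeholders.

import boto3
from google.cloud import vision

def rekognition_text(image_path: str):
    client = boto3.client("rekognition", region_name="us-east-1")  # region is a placeholder
    with open(image_path, "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})
    # WORD-level detections roughly match the one-token-per-line list above.
    return [
        d["DetectedText"]
        for d in response["TextDetections"]
        if d["Type"] == "WORD"
    ]

def google_text(image_path: str):
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.text_detection(image=image)
    # The first annotation is the full text block; the rest are individual words.
    return [ann.description for ann in response.text_annotations[1:]]

if __name__ == "__main__":
    print(rekognition_text("men_at_diner.jpg"))  # hypothetical file name
    print(google_text("men_at_diner.jpg"))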