Human Generated Data

Title

Untitled (five women in restaurant booth)

Date

1950

People

Artist: Orrion Barger, American, active 1913–1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6275

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.7
Human 99.7
Clothing 99.6
Apparel 99.6
Person 99.4
Chair 94
Furniture 94
Female 91
Restaurant 90.8
Meal 87
Food 87
Dress 83.2
Woman 82.4
Sleeve 80.1
Gown 75.7
Robe 75.7
Fashion 75.7
Evening Dress 75.7
Veil 73.8
Plant 70.9
Table 70.4
Cafe 69.2
Sitting 69.1
Cafeteria 68.7
Blonde 68.6
Teen 68.6
Kid 68.6
Girl 68.6
Child 68.6
Dish 62.4
Photography 61.2
Photo 61.2
Indoors 60
Wedding 57.9
Suit 57.7
Coat 57.7
Overcoat 57.7
Dining Table 57.5
Flower 57.5
Blossom 57.5
Room 56.6
Long Sleeve 55.6

Clarifai
created on 2023-10-26

people 99.6
monochrome 97.9
woman 96.6
adult 96
man 92.5
group 91.8
two 91.5
furniture 89
sit 88.2
music 87
child 83.2
wear 81.6
instrument 77.7
group together 77.2
administration 76.7
chair 74.4
three 74.3
actress 74.1
adolescent 74
room 73.8

Imagga
created on 2022-01-22

man 41.6
brass 31.2
people 30.6
male 29
room 28.3
wind instrument 26.4
indoors 23.7
person 23.1
office 22.9
senior 21.5
businessman 20.3
computer 19.2
business 18.8
adult 18.7
men 18
musical instrument 17.9
home 17.5
sitting 17.1
laptop 16.6
elderly 16.2
meeting 16
cornet 15.6
executive 15.6
old 15.3
work 14.9
couple 14.8
desk 14.1
photographer 14.1
table 13.8
retired 13.6
chair 13.3
interior 13.2
mature 13
group 12.9
indoor 12.8
communication 12.6
professional 12.4
working 12.4
smiling 12.3
lifestyle 12.3
device 12.2
job 11.5
classroom 11.4
together 11.4
women 11.1
inside 11
horizontal 10.9
aged 10.8
team 10.7
retirement 10.5
businesspeople 10.4
occupation 10.1
worker 9.9
employee 9.7
portrait 9.7
looking 9.6
corporate 9.4
happy 9.4
handsome 8.9
restaurant 8.8
equipment 8.8
happiness 8.6
casual 8.5
modern 8.4
teamwork 8.3
trombone 8.3
camera 8.3
businesswoman 8.2
suit 8.1
nurse 7.8
40s 7.8
colleagues 7.8
hand 7.6
enjoyment 7.5
glasses 7.4
color 7.2
smile 7.1
patient 7.1
face 7.1
hall 7.1
medical 7.1

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

person 91.1
text 88.6
window 86.8
black and white 57
clothing 53.5
table 27.5
desk 5.9

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 37-45
Gender Female, 99.8%
Calm 99.9%
Sad 0.1%
Surprised 0%
Happy 0%
Disgusted 0%
Fear 0%
Angry 0%
Confused 0%

AWS Rekognition

Age 35-43
Gender Female, 93.8%
Calm 79.2%
Happy 9.6%
Surprised 4.9%
Confused 3.5%
Sad 1.2%
Angry 0.9%
Disgusted 0.5%
Fear 0.2%

AWS Rekognition

Age 48-56
Gender Male, 88.9%
Happy 96%
Confused 1.6%
Sad 0.8%
Surprised 0.7%
Calm 0.5%
Disgusted 0.2%
Fear 0.2%
Angry 0.1%

Feature analysis

Amazon

Person 99.7%

Text analysis

Amazon

as