Human Generated Data

Title

Untitled (outdoor restaurant)

Date

1976

People

Artist: Sage Sohier, American, born 1954

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.1146

Copyright

© Sage Sohier

Machine Generated Data

Tags (scores are percent confidence)

Amazon
created on 2022-01-09

Restaurant 99.4
Person 99.3
Human 99.3
Person 99.1
Person 98.6
Meal 93.7
Food 93.7
Person 92.8
Person 89.6
Cafe 88.8
Cafeteria 87.6
Home Decor 81
Food Court 79.3
Interior Design 67.3
Indoors 67.3
Sitting 58.1
Couch 57.4
Furniture 57.4
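
The Amazon tags above are the kind of output Amazon Rekognition's DetectLabels operation returns. A minimal sketch of how such a list could be reproduced with boto3 follows; the image path and the MaxLabels/MinConfidence values are illustrative assumptions, not values taken from this record.

# Sketch: Rekognition DetectLabels on a local image file.
# Assumes AWS credentials are configured; "photo.jpg" is a placeholder path.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,        # cap on returned labels (assumed value)
    MinConfidence=50.0,  # drop low-confidence guesses (assumed value)
)

# Each label carries a Name and a 0-100 Confidence,
# matching the "Restaurant 99.4"-style rows above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")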

Clarifai
created on 2023-10-25

people 99.7
woman 98
group 96.8
monochrome 96.7
table 96.6
collage 96.3
group together 95.8
recreation 95.3
man 94.8
beach 94.7
girl 94.6
wedding 94.6
restaurant 94.6
adult 94
sea 92.9
water 92.8
chair 92.5
child 91.9
bride 91.2
furniture 90.4
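
Clarifai concepts like those above can be requested over its v2 REST API. The sketch below assumes the public general-image-recognition model under the clarifai/main app and a personal access token; treat the model path and payload shape as assumptions based on Clarifai's documented v2 conventions.

# Hedged sketch of a Clarifai v2 predict call.
import requests

PAT = "YOUR_CLARIFAI_PAT"  # placeholder personal access token
URL = ("https://api.clarifai.com/v2/users/clarifai/apps/main/"
       "models/general-image-recognition/outputs")  # assumed public model path

resp = requests.post(
    URL,
    headers={"Authorization": f"Key {PAT}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
resp.raise_for_status()

# Concepts carry a 0-1 value; scale to percent to match the list above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")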

Imagga
created on 2022-01-09

interior 23.8
shop 19
modern 18.2
salon 17.1
people 16.2
house 15
indoors 14.9
inside 14.7
chair 13.4
table 13.2
building 13.2
counter 13
design 12.9
architecture 12.6
window 12.3
furniture 12.3
man 12.1
indoor 11.9
business 11.5
office 11.4
light 11.3
computer 11.3
travel 11.3
home 11.2
men 11.1
mercantile establishment 11.1
room 10.4
black 10.2
male 9.9
musical instrument 9.9
transportation 9.9
urban 9.6
structure 9.4
equipment 9.4
coat hanger 9.4
glass 9.3
person 9.3
floor 9.3
transport 9.1
silhouette 9.1
support 9.1
stringed instrument 8.8
work 8.8
women 8.7
guitar 8.4
kitchen 8.3
technology 8.2
style 8.1
working 7.9
chairs 7.8
relaxation 7.5
place of business 7.5
shoe shop 7.5
hanger 7.5
tourism 7.4
barbershop 7.4
water 7.3
life 7.3
group 7.2
lifestyle 7.2
art 7.1
basket 7.1
adult 7.1
worker 7.1
decor 7.1
sky 7
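
Imagga's tagger is reachable through a plain REST endpoint with HTTP basic auth. A minimal sketch, with the key, secret, and image URL as placeholders:

# Sketch of Imagga's /v2/tags endpoint.
import requests

API_KEY = "YOUR_IMAGGA_KEY"        # placeholder credentials
API_SECRET = "YOUR_IMAGGA_SECRET"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

# Each tag has an English name and a 0-100 confidence, as in the list above.
for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")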

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 88.7
table 87.4
different 82.7
person 79.9
clothing 70.6
black and white 66.2
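
The Microsoft rows resemble output from Azure's Computer Vision image-tagging endpoint. A sketch using the azure-cognitiveservices-vision-computervision SDK, with the endpoint, key, and image URL as placeholders:

# Sketch: image tagging with the Azure Computer Vision Python SDK.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://YOUR-RESOURCE.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("YOUR_AZURE_KEY"),        # placeholder key
)

result = client.tag_image("https://example.com/photo.jpg")

# The SDK reports a 0-1 confidence; scale to percent to match the rows above.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")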

Face analysis

AWS Rekognition (face 1)

Age 18-24
Gender Male, 99.1%
Calm 97%
Surprised 0.8%
Fear 0.5%
Disgusted 0.5%
Sad 0.4%
Confused 0.4%
Angry 0.3%
Happy 0.1%

AWS Rekognition (face 2)

Age 18-24
Gender Male, 84.3%
Confused 51.2%
Sad 30%
Calm 9%
Fear 3.3%
Angry 2.5%
Disgusted 1.7%
Happy 1.2%
Surprised 1%

AWS Rekognition (face 3)

Age 23-33
Gender Male, 57.1%
Sad 96.5%
Calm 3.3%
Disgusted 0%
Confused 0%
Fear 0%
Surprised 0%
Happy 0%
Angry 0%
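
The per-face age ranges, gender calls, and emotion percentages above have the shape of Amazon Rekognition's DetectFaces output. A minimal boto3 sketch, with the image path as a placeholder:

# Sketch: per-face attributes via Rekognition DetectFaces.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder image path
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for i, face in enumerate(response["FaceDetails"], start=1):
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Face {i}: Age {age['Low']}-{age['High']}, "
          f"Gender {gender['Value']} {gender['Confidence']:.1f}%")
    # Emotions sorted most-to-least confident, as in the blocks above.
    for emo in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"  {emo['Type'].title()} {emo['Confidence']:.1f}%")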

Google Vision (face 1)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 2)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 3)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 4)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
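
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why every row above reads "Very unlikely". A sketch with the google-cloud-vision client, image path as a placeholder:

# Sketch: face detection with the Google Cloud Vision client.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # placeholder image path
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

for i, face in enumerate(response.face_annotations, start=1):
    # Each attribute is a Likelihood enum; .name gives e.g. VERY_UNLIKELY.
    print(f"Face {i}:")
    print(f"  Surprise {face.surprise_likelihood.name}")
    print(f"  Anger {face.anger_likelihood.name}")
    print(f"  Sorrow {face.sorrow_likelihood.name}")
    print(f"  Joy {face.joy_likelihood.name}")
    print(f"  Headwear {face.headwear_likelihood.name}")
    print(f"  Blurred {face.blurred_likelihood.name}")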

Feature analysis

Amazon

Person 99.3%
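
The Person percentage under Feature analysis matches Rekognition's label instances, which attach a bounding box and confidence to each detected person in a DetectLabels response. A self-contained sketch, with the image path as a placeholder:

# Sketch: per-person bounding boxes from Rekognition DetectLabels instances.
import boto3

client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:  # placeholder image path
    response = client.detect_labels(Image={"Bytes": f.read()})

for label in response["Labels"]:
    if label["Name"] == "Person":
        for inst in label.get("Instances", []):
            box = inst["BoundingBox"]  # fractions of image width/height
            print(f"Person {inst['Confidence']:.1f}% at "
                  f"left={box['Left']:.2f} top={box['Top']:.2f} "
                  f"w={box['Width']:.2f} h={box['Height']:.2f}")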
