Human Generated Data

Title

Untitled (four women chatting beside large buffet table outdoors)

Date

1959

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9672

Machine Generated Data

Tags (each label is followed by the service's confidence score, %)

Amazon
created on 2022-01-23

Human 99
Person 99
Person 98.9
Person 98.6
Apparel 93.1
Clothing 93.1
Nature 92.5
Outdoors 92.3
Meal 90.8
Food 90.8
Chair 89.6
Furniture 89.6
Plant 88.9
Tree 78.6
Water 74.4
Face 73.6
Table 72.8
Flower 70.5
Blossom 70.5
Yard 70.2
Art 69.5
Dish 69.2
Dessert 68.1
Creme 68.1
Icing 68.1
Cream 68.1
Cake 68.1
Coat 64.1
Overcoat 64.1
Suit 64.1
Female 63.3
Photo 62.5
Photography 62.5
Person 62.4
Advertisement 61.9
Painting 61.7
Crowd 61.4
Jar 61.4
Pottery 61.4
Potted Plant 61.4
Vase 61.4
Dining Table 59.4
Poster 59.4
Ice 59.2
Bridegroom 56.3
Wedding 56.3
Collage 55.9
Dating 55.9
Man 55.6
Shorts 55.1
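
The list above pairs each detected label with a confidence score. A minimal sketch of how labels of this shape could be produced with AWS Rekognition's DetectLabels API via boto3 follows; the bucket, object key, and threshold are illustrative placeholders, not details from this record.

import boto3

# Placeholder client and image location; this record does not document
# where the digitized photograph is actually stored.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},
    MaxLabels=50,
    MinConfidence=55.0,  # assumed threshold; the list above bottoms out near 55%
)

# Each label carries a name and a confidence score on a 0-100 scale.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')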

Imagga
created on 2022-01-23

cemetery 37.2
building 20.5
old 19.5
architecture 18.7
travel 17.6
city 16.6
snow 14.8
outdoor 14.5
vintage 14.5
grunge 13.6
structure 13.5
gravestone 13.4
water 13.3
winter 12.8
light 12.7
house 12.5
church 12
tree 11.8
history 11.6
park 11.5
landscape 11.2
people 11.2
stone 10.9
black 10.8
tourism 10.7
rural 10.6
art 10.4
antique 10.4
fence 10.3
landmark 9.9
memorial 9.9
wall 9.6
sky 9.6
picket fence 9.4
seller 9.1
religion 9
container 8.8
man 8.7
forest 8.7
flowers 8.7
season 8.6
windowsill 8.5
person 8.4
town 8.3
texture 8.3
new 8.1
romantic 8
holiday 7.9
ashcan 7.9
male 7.9
scene 7.8
ancient 7.8
cold 7.7
grungy 7.6
boat 7.4
retro 7.4
street 7.4
historic 7.3
lake 7.3
bin 7.2
aged 7.2
color 7.2
trees 7.1
paper 7.1
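
Imagga's tagger is a REST endpoint. A minimal sketch, assuming Imagga's public /v2/tags API with placeholder credentials and image URL:

import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},  # placeholder URL
    auth=("API_KEY", "API_SECRET"),  # Imagga uses HTTP Basic auth
)
resp.raise_for_status()

# The response nests tags as result -> tags -> [{confidence, tag: {en}}].
for entry in resp.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')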

Google
created on 2022-01-23
(no tags returned)

Microsoft
created on 2022-01-23

clothing 90.2
black and white 89.4
person 89.4
text 88.9
drawing 83.3
man 79.3
food 61.7
funeral 58.7
old 51.2

Face analysis

AWS Rekognition

Age 35-43
Gender Male, 99.5%
Sad 75.6%
Happy 18%
Calm 1.9%
Surprised 1.8%
Confused 1.1%
Disgusted 0.8%
Angry 0.6%
Fear 0.2%
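
A minimal sketch of the Rekognition DetectFaces call that yields age-range, gender, and per-emotion confidences of the shape shown above; the bucket and key are placeholders:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},
    Attributes=["ALL"],  # request age, gender, emotions, and other attributes
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotions are scored independently; sorting reproduces the ordering above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')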

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
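
Google Cloud Vision reports face attributes as likelihood buckets ("Very unlikely" through "Very likely") rather than percentages, which matches the values above. A minimal sketch using the google-cloud-vision client with a placeholder image URI:

from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image(
    source=vision.ImageSource(image_uri="https://example.org/photo.jpg")
)

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each attribute is a Likelihood enum, e.g. VERY_UNLIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)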

Feature analysis

Amazon

Person 99%
Painting 61.7%

Captions

Microsoft

an old photo of a person 68.4%
old photo of a person 64.6%
a old photo of a person 63.3%
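
The three lines above are alternative candidate captions, each with its own confidence, reproduced exactly as the service returned them. A minimal sketch, assuming Azure's Computer Vision "describe" operation via the azure-cognitiveservices-vision-computervision SDK, with placeholder endpoint and key:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://example.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("API_KEY"),
)

# Ask for several candidate captions; confidences come back on a 0-1 scale.
analysis = client.describe_image("https://example.org/photo.jpg", max_candidates=3)
for caption in analysis.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")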

Text analysis

Amazon

x
MJIGS-
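
A minimal sketch of Rekognition's DetectText call, a plausible source of fragments like those above (the fragments themselves are left exactly as detected); bucket and key are placeholders:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}}
)

# LINE detections give whole strings; WORD detections give the pieces.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])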