Human Generated Data

Title

Untitled (elevated view of church interior with people leaving pews and priest at front)

Date

1950

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9328

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Interior Design 93.3
Indoors 93.3
Person 86.4
Human 86.4
Person 81.8
Person 80.2
Art 78.5
Person 77.2
Nature 69.7
Paper 68.4
Person 68
Chair 66.3
Furniture 66.3
Advertisement 65.9
Chair 65.1
Outdoors 64.8
Person 64.3
Room 63.7
Chair 62.2
Person 61.3
Poster 60.4
Person 49

Clarifai
created on 2023-10-27

people 99.6
group 98.9
crowd 98.9
many 97.7
man 97.3
adult 94.8
desktop 94.5
woman 92.5
leader 92.2
group together 91.4
illustration 89.6
street 86.4
silhouette 85.3
business 85
audience 84.2
meeting 81.3
art 80.8
music 80.1
child 79.7
modern 78.9

Imagga
created on 2022-01-23

shower curtain 70.2
curtain 55.7
furnishing 42.4
blind 42.1
glass 37.4
protective covering 29.3
texture 27.8
grunge 23.8
pattern 20.5
design 19.7
surface 17.6
art 17.4
backdrop 16.5
covering 15.2
black 15
water 14.7
graphic 14.6
wallpaper 14.5
floral 14.5
textured 14
old 13.9
material 13.4
cool 13.3
retro 13.1
liquid 13
vintage 12.7
drop 12.7
flower 12.3
rough 11.8
dirty 11.7
frame 11.6
backgrounds 11.3
plant 11.2
clean 10.9
border 10.8
close 10.3
aged 9.9
container 9.9
transparent 9.8
detail 9.6
structure 9.1
wet 8.9
color 8.9
cold 8.6
leaf 8.6
grungy 8.5
decoration 8.5
plants 8.3
grain 8.3
window 8.2
effect 8.2
light 8
colors 7.9
wall 7.8
antique 7.8
edge 7.7
aqua 7.6
tray 7.5
silhouette 7.4
element 7.4
closeup 7.4
reflection 7.3
lines 7.2
paper 7.2
colorful 7.2
creative 7.1
medicine 7

Microsoft
created on 2022-01-23

text 98.8
drawing 98.4
sketch 93.9
black and white 67.3
group 65.4
posing 62.6
map 53.4

Face analysis

Amazon

AWS Rekognition

Age 42-50
Gender Male, 98.5%
Calm 87%
Happy 5.4%
Sad 4.3%
Confused 2.1%
Disgusted 0.5%
Angry 0.3%
Surprised 0.3%
Fear 0.2%

AWS Rekognition

Age 26-36
Gender Male, 99.5%
Calm 81%
Sad 16.6%
Happy 0.9%
Angry 0.4%
Fear 0.3%
Surprised 0.3%
Disgusted 0.3%
Confused 0.2%

AWS Rekognition

Age 29-39
Gender Female, 95.1%
Happy 94.3%
Calm 4.2%
Disgusted 0.5%
Confused 0.3%
Surprised 0.2%
Angry 0.2%
Fear 0.2%
Sad 0.1%

AWS Rekognition

Age 31-41
Gender Male, 99.4%
Calm 61.2%
Sad 33.3%
Disgusted 2.8%
Confused 1.1%
Surprised 0.5%
Angry 0.5%
Fear 0.4%
Happy 0.3%

AWS Rekognition

Age 27-37
Gender Female, 61.4%
Sad 44.1%
Disgusted 29.1%
Confused 10.5%
Calm 9.7%
Surprised 2.1%
Angry 2%
Happy 1.3%
Fear 1.2%

AWS Rekognition

Age 29-39
Gender Female, 67.1%
Calm 79.7%
Happy 13.6%
Sad 3.6%
Confused 0.8%
Angry 0.6%
Fear 0.6%
Disgusted 0.6%
Surprised 0.5%

AWS Rekognition

Age 21-29
Gender Female, 57.7%
Calm 99.9%
Happy 0%
Confused 0%
Sad 0%
Angry 0%
Disgusted 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 21-29
Gender Female, 71.4%
Calm 97.9%
Angry 0.6%
Sad 0.6%
Confused 0.3%
Disgusted 0.3%
Happy 0.2%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 25-35
Gender Female, 66%
Calm 62.8%
Sad 26.2%
Confused 2.8%
Angry 2.6%
Happy 2.1%
Fear 1.3%
Disgusted 1.2%
Surprised 1%

Feature analysis

Amazon

Person 86.4%
Chair 66.3%

Text analysis

Amazon

NACO
so
YT33AP NACO
YT33AP
In
2607 In
2607

Google

NA
NA