Human Generated Data

Title

Untitled (governing board meeting)

Date

1937

People

Artist: Harris & Ewing, American 1910s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.22317

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Person 97.8
Human 97.8
Person 93.8
Furniture 92.3
Person 91.6
Clothing 91.3
Apparel 91.3
Person 88.7
Person 86.3
Person 85.3
Person 81.9
Person 76.3
Person 76
Person 73.8
Painting 69.3
Art 69.3
Table 67.7
Person 66.4
Person 65.9
Face 65.4
Indoors 65
People 62.2
Bed 62
Room 60
Mosquito Net 56.1
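
The Amazon tags above have the shape of output from AWS Rekognition's label-detection API: a label name plus a 0-100 confidence score. Below is a minimal sketch of how such a listing could be produced with boto3; the region, bucket, and file name are placeholders, not part of this record.

```python
import boto3

# Rekognition client; the region is an assumption for illustration.
client = boto3.client("rekognition", region_name="us-east-1")

# Run label detection on an image stored in S3 (placeholder bucket/key).
response = client.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "example.jpg"}},
    MaxLabels=30,
    MinConfidence=50,
)

# Print "Label confidence" pairs in the same shape as the listing above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```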

Clarifai
created on 2023-10-22

people 96.8
design 96.2
monochrome 96.1
no person 95
wear 93.8
illustration 92.5
inside 91.7
mirror 89.9
science 89.2
indoors 88.4
street 88
desktop 87.7
room 87.6
art 86.8
style 86.6
window 85.9
grey 83.8
abstract 83
group 82.9
square 82.8
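
The Clarifai tags resemble the concepts returned by Clarifai's v2 predict endpoint. A hedged sketch over plain HTTP follows; the model ID, API key, and image URL are assumptions for illustration, not values from this record.

```python
import requests

MODEL_ID = "general-image-recognition"  # assumption: Clarifai's general model
url = f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs"

resp = requests.post(
    url,
    headers={"Authorization": "Key YOUR_CLARIFAI_API_KEY"},  # placeholder key
    json={"inputs": [{"data": {"image": {"url": "https://example.org/image.jpg"}}}]},
)
resp.raise_for_status()

# Concepts come back with a 0-1 "value"; scale to match the listing above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```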

Imagga
created on 2022-03-11

balcony 47.5
sketch 32.9
structure 30.3
interior 27.4
house 24.2
architecture 23.9
drawing 23.6
glass 22
modern 21
furniture 20.8
representation 20
home 19.9
design 19.7
chair 16.2
building 16.2
fire screen 16
screen 15.5
metal 15.3
table 15.1
decor 15
window 14.2
room 13.4
decoration 13.3
protective covering 12.6
floor 12.1
luxury 12
wall 11.7
wood 11.7
water 11.3
new 11.3
style 11.1
construction 11.1
covering 10.8
city 10.8
apartment 10.5
food 10.3
urban 9.6
development 9.5
empty 9.4
seat 9.3
outdoor 9.2
tile 9.2
equipment 9.1
business 9.1
steel 8.8
light 8.7
engineering 8.6
elegant 8.6
estate 8.5
3d 8.5
old 8.4
wooden 7.9
indoors 7.9
chairs 7.8
travel 7.7
residential 7.7
patio 7.6
contemporary 7.5
clean 7.5
technology 7.4
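
Imagga exposes its tagger as a REST endpoint authenticated with HTTP basic auth. A sketch of how a comparable tag list might be fetched; the credentials and image URL are placeholders.

```python
import requests

# Imagga v2 tagging endpoint; key, secret, and image URL are placeholders.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/image.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),  # HTTP basic auth
)
resp.raise_for_status()

# Each tag carries an English label and a 0-100 confidence, as listed above.
for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```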

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

drawing 95
indoor 92.2
sketch 87.8
black and white 87.3
house 81.6
window 80.6
building 60.4
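
The Microsoft tags are the kind returned by Azure Computer Vision's tag operation. A sketch with the Python SDK; the endpoint and subscription key are placeholders.

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Endpoint and key are placeholders, not part of this record.
client = ComputerVisionClient(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_SUBSCRIPTION_KEY"),
)

# The tag operation returns names with 0-1 confidences; scale to 0-100.
result = client.tag_image("https://example.org/image.jpg")
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```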

Color Analysis

Face analysis

AWS Rekognition

Age 31-41
Gender Male, 52.5%
Sad 97.8%
Confused 0.9%
Angry 0.5%
Calm 0.3%
Disgusted 0.2%
Happy 0.1%
Fear 0.1%
Surprised 0.1%

AWS Rekognition

Age 39-47
Gender Female, 54.9%
Sad 52%
Confused 24.5%
Calm 15.2%
Fear 5.3%
Surprised 0.9%
Happy 0.7%
Angry 0.7%
Disgusted 0.6%

AWS Rekognition

Age 35-43
Gender Male, 99.9%
Sad 91.3%
Confused 5.8%
Calm 2.2%
Angry 0.2%
Disgusted 0.2%
Fear 0.1%
Happy 0.1%
Surprised 0.1%

AWS Rekognition

Age 27-37
Gender Female, 85.4%
Calm 66.2%
Sad 17.2%
Confused 10.1%
Disgusted 1.8%
Angry 1.6%
Surprised 1.2%
Happy 1%
Fear 1%

AWS Rekognition

Age 31-41
Gender Male, 81.4%
Sad 36%
Disgusted 21.4%
Confused 20.9%
Calm 13.4%
Fear 3.2%
Happy 2.1%
Angry 1.6%
Surprised 1.4%

AWS Rekognition

Age 7-17
Gender Female, 52.3%
Confused 77.6%
Happy 14.2%
Sad 3%
Calm 2.7%
Disgusted 1%
Angry 0.8%
Fear 0.5%
Surprised 0.3%
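
Each block above follows the structure of an AWS Rekognition detect_faces response: an estimated age range, a gender guess with its confidence, and a confidence per emotion. A minimal boto3 sketch; the region, bucket, and file name are placeholders.

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # region is an assumption

# Request the full attribute set (age range, gender, emotions).
response = client.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "example.jpg"}},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # One confidence per emotion, highest first, as in the blocks above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```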

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
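
The Google Vision entries report face attributes as likelihood buckets ("Very unlikely" through "Very likely") rather than percentages. A sketch with the google-cloud-vision client; the image URI is a placeholder.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # uses application default credentials

# Placeholder image URI, not the catalog image's real URL.
image = vision.Image(source=vision.ImageSource(image_uri="https://example.org/image.jpg"))
response = client.face_detection(image=image)

def bucket(likelihood) -> str:
    # Render the Likelihood enum ("VERY_UNLIKELY") as "Very unlikely".
    return likelihood.name.replace("_", " ").capitalize()

for face in response.face_annotations:
    print("Surprise", bucket(face.surprise_likelihood))
    print("Anger", bucket(face.anger_likelihood))
    print("Sorrow", bucket(face.sorrow_likelihood))
    print("Joy", bucket(face.joy_likelihood))
    print("Headwear", bucket(face.headwear_likelihood))
    print("Blurred", bucket(face.blurred_likelihood))
```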

Feature analysis

Amazon

Person 97.8%
Person 93.8%
Person 91.6%
Person 88.7%
Person 86.3%
Person 85.3%
Person 81.9%
Person 76.3%
Person 76%
Person 73.8%
Person 66.4%
Person 65.9%
Painting 69.3%
Bed 62%

Categories

Captions

Microsoft
created on 2022-03-11

a close up of a cage 63.8%
close up of a cage 53.6%
a person in a cage 44.3%
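
These candidate captions, each with a confidence score, match the output of Azure Computer Vision's describe operation. A sketch with the Python SDK; the endpoint, key, and image URL are placeholders.

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_SUBSCRIPTION_KEY"),
)

# Ask the describe operation for several candidate captions.
result = client.describe_image("https://example.org/image.jpg", max_candidates=3)
for caption in result.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```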