Human Generated Data

Title

Untitled (girls basketball team posing on steps)

Date

1925

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1904

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Handrail 100
Banister 100
Person 99.2
Human 99.2
Person 99
Person 98.6
Staircase 98.2
Apparel 95.3
Clothing 95.3
Person 94.2
Railing 91.7
Person 90.6
Person 87.8
Person 83.6
Person 81.9
Dress 79.6
Person 78.9
Female 77.1
Fashion 69.8
Person 67.5
Robe 64.6
Person 64.4
Gown 62.8
Flower 59.2
Blossom 59.2
Petal 59.2
Plant 59.2
Girl 58.4
Interior Design 57.7
Indoors 57.7
Woman 56
Person 51.9
Person 48.2
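
The Amazon tags above pair each label with a confidence score from 0 to 100, which is the shape of output returned by Amazon Rekognition's DetectLabels operation. A minimal sketch of how such a list could be generated with boto3 follows; the file name and the MaxLabels/MinConfidence values are illustrative assumptions, not values taken from this record.

# Minimal sketch (assumptions: AWS credentials are configured for boto3
# and "photo.jpg" is a local copy of the photograph).
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,        # cap on the number of labels returned
    MinConfidence=40.0,  # drop low-confidence labels
)

# Print "Label Confidence" pairs in the same form as the list above.
for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))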

Clarifai
created on 2023-10-25

step 100
people 99.9
adult 98.6
man 98.3
leader 97.5
administration 97.2
group 97.1
many 96
group together 94.2
woman 93.9
monochrome 85.8
one 84.8
several 84.4
ceremony 82.4
child 79.2
chair 77.2
stump 76.3
war 74.5
steps 74.1
theater 72.9

Imagga
created on 2021-12-14

stairs 53.8
step 47.9
support 44.2
barrier 32.9
device 31.9
architecture 29.8
structure 29.7
building 26.1
interior 25.6
obstruction 23.9
house 23.4
modern 18.9
floor 18.6
home 18.3
room 17
people 15.1
inside 14.7
indoor 14.6
tread 14.3
city 14.1
man 14.1
design 13.5
window 13
construction 12.8
travel 12.7
structural member 12.4
indoors 12.3
lifestyle 12.3
corridor 11.8
life 11.7
business 11.5
apartment 11.5
wooden 11.4
urban 11.4
old 11.1
hall 10.8
decor 10.6
office 10.4
kin 10.4
wall 10.3
stone 10.1
nobody 10.1
3d 10.1
balcony 10.1
furniture 10
light 10
wood 10
stairway 9.9
staircase 9.8
steps 9.8
entrance 9.7
table 9.5
railing 9.4
luxury 9.4
casual 9.3
portrait 9.1
walkway 8.8
person 8.8
couple 8.7
walking 8.5
male 8.5
adult 8.4
exterior 8.3
transport 8.2
vacation 8.2
style 8.2
metal 8
decoration 8
stair 7.9
passage 7.8
chair 7.8
day 7.8
glass 7.8
concrete 7.6
walk 7.6
living 7.6
relax 7.6
happy 7.5
tourism 7.4
women 7.1
love 7.1
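
The Imagga tags above follow the same tag-plus-confidence pattern. A hedged sketch of retrieving such tags from Imagga's v2 REST tagging endpoint is shown below; the endpoint path, the image_url parameter, the response layout, and the placeholder credentials are assumptions based on Imagga's public API rather than details recorded here.

# Hedged sketch (assumptions: the /v2/tags endpoint, its image_url parameter,
# and the result/tags response layout follow Imagga's public v2 API; the key,
# secret, and image URL are placeholders).
import requests

IMAGGA_KEY = "your_api_key"          # placeholder
IMAGGA_SECRET = "your_api_secret"    # placeholder
IMAGE_URL = "https://example.org/photo.jpg"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
response.raise_for_status()

for item in response.json()["result"]["tags"]:
    print(item["tag"]["en"], round(item["confidence"], 1))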

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 99.9
stairs 89.7
black 72.2

Color Analysis

Face analysis

AWS Rekognition

Age 36-52
Gender Female, 99.5%
Happy 57.8%
Calm 18.2%
Sad 13.7%
Fear 5%
Disgusted 2.9%
Angry 0.9%
Confused 0.9%
Surprised 0.6%

AWS Rekognition

Age 29-45
Gender Female, 51.2%
Happy 36.3%
Disgusted 20.5%
Calm 19.7%
Sad 7.5%
Fear 5.5%
Angry 4.2%
Confused 4.1%
Surprised 2.2%

AWS Rekognition

Age 46-64
Gender Female, 73%
Calm 77.4%
Happy 11.4%
Sad 6.5%
Confused 1.3%
Disgusted 1%
Surprised 1%
Fear 1%
Angry 0.4%

AWS Rekognition

Age 50-68
Gender Female, 56.4%
Calm 47.6%
Happy 34.7%
Disgusted 8.3%
Confused 5.2%
Sad 2.3%
Angry 0.9%
Surprised 0.8%
Fear 0.2%

AWS Rekognition

Age 47-65
Gender Female, 87.2%
Happy 46.9%
Calm 16.8%
Confused 7.7%
Sad 7.3%
Surprised 6.1%
Fear 5.7%
Angry 5.5%
Disgusted 3.9%

AWS Rekognition

Age 33-49
Gender Female, 87.5%
Sad 32.4%
Fear 17.9%
Calm 15.1%
Disgusted 15.1%
Happy 10.8%
Angry 4.2%
Confused 2.6%
Surprised 1.9%

AWS Rekognition

Age 38-56
Gender Male, 70.5%
Sad 88.5%
Calm 8.6%
Confused 1.3%
Happy 0.7%
Angry 0.5%
Surprised 0.2%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 36-54
Gender Female, 87.4%
Happy 47.4%
Calm 39.8%
Sad 5.4%
Disgusted 3.5%
Fear 2.2%
Angry 0.6%
Surprised 0.6%
Confused 0.5%

AWS Rekognition

Age 38-56
Gender Female, 95.2%
Calm 40.9%
Happy 15.8%
Sad 13.4%
Confused 10.2%
Surprised 7.6%
Fear 7.4%
Disgusted 2.8%
Angry 1.8%

AWS Rekognition

Age 53-71
Gender Male, 80.9%
Calm 28.5%
Confused 19.4%
Surprised 15.9%
Happy 14.8%
Fear 8.5%
Sad 8.3%
Angry 3.4%
Disgusted 1.1%

AWS Rekognition

Age 22-34
Gender Female, 95.9%
Calm 32.2%
Sad 28.1%
Happy 16%
Surprised 7.9%
Confused 7.5%
Fear 5.4%
Disgusted 2.1%
Angry 0.8%

AWS Rekognition

Age 45-63
Gender Female, 79.1%
Happy 64.7%
Calm 23.3%
Sad 8.1%
Confused 3.2%
Surprised 0.3%
Angry 0.2%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 39-57
Gender Female, 68.1%
Calm 74.8%
Happy 11.8%
Sad 4.6%
Surprised 3.3%
Confused 3.2%
Angry 0.9%
Fear 0.7%
Disgusted 0.6%

AWS Rekognition

Age 23-37
Gender Male, 83%
Calm 26.6%
Surprised 16%
Fear 15.6%
Sad 12.5%
Confused 12%
Happy 8.5%
Angry 5.2%
Disgusted 3.6%
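
Each AWS Rekognition block above reports an estimated age range, a gender guess with its confidence, and a ranked list of emotion scores; that is the structure Rekognition's DetectFaces operation returns when all facial attributes are requested. A minimal sketch, assuming boto3 credentials are configured and "photo.jpg" is a local copy of the image (both assumptions):

# Minimal sketch (assumptions: AWS credentials are configured for boto3
# and "photo.jpg" is a local copy of the photograph).
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Sort emotions by confidence to mirror the lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")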

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely
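
The Google Vision entries above report likelihood buckets (Very unlikely through Very likely) instead of percentages; these correspond to the Likelihood enum values returned by the Cloud Vision face detection API. A minimal sketch, assuming the google-cloud-vision client library, configured application credentials, and a local copy of the image:

# Minimal sketch (assumptions: google-cloud-vision is installed, application
# default credentials are configured, and "photo.jpg" is a local copy).
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY).
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)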

Feature analysis

Amazon

Person 99.2%
Staircase 98.2%

Categories

Imagga

paintings art 93.8%
interior objects 5.8%